Looks like it has been almost 4 years since my last post. Lots of things have happened since then… Probably time to start writing again.
Hopefully I’ve percolated enough interesting content in that time… more to follow.
Interesting to consider that in the world of high-frequency trading (HFT), the latencies that matter are orders of magnitude smaller than what “latency” means in the world of the web.
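For a rough sense of that gap, here is a back-of-the-envelope sketch; the figures are my own ballpark assumptions, not measurements:

```python
# Rough, illustrative latency scales (ballpark assumptions only).
SCALES = {
    "web page load":            1.0,     # ~1 second
    "web API round trip":       100e-3,  # ~100 milliseconds
    "datacenter round trip":    500e-6,  # ~500 microseconds
    "HFT tick-to-trade budget": 10e-6,   # ~10 microseconds or less
}

baseline = SCALES["web API round trip"]
for name, seconds in SCALES.items():
    print(f"{name:26s} {seconds:>12.6f} s "
          f"({baseline / seconds:,.2f}x vs web API)")
```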
The world is a dynamic mess of jiggling things.
That pretty much explains most things in life.
This post about automation drew my attention. It’s well written and tries to address some of the problems with automation and the general “automate all the things” attitude. However, I don’t think the problem is with automation itself. This goes back to the root problem of complex systems that develop emergent properties, to resilience engineering, and to “black swan” events. The author himself has a great post on this topic.
When automating a repetitive task, the chance of error, and more importantly the chance of a disproportionately significant impact, is very low. When you’re using automation to walk through complex tree logic, the impact of an error increases considerably. The problems with automating for rare events that involve multiple components are:
So I think that’s the wrong way to frame the problem. Automation is a secondary factor that amplifies existing problems with system complexity. These are some guidelines for designing around it:
A couple of relevant articles that are really about the same thing:
1. An example from aviation, a field that has been dealing with complexity and resilience for a long time. The title is very fitting: “Want to build resilience? Kill the Complexity”. It’s equally applicable in almost every field.
2. Architecture of Robust, Evolvable Networks. That’s the abstract; the actual paper is here. He talks about the internet as a whole, but smaller networks are often a microcosm of the very same thing.
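To make the amplification point above concrete, here is a minimal sketch, entirely my own illustration (the script, names, and thresholds are hypothetical), of automation designed to limit its own blast radius instead of blindly walking a decision tree:

```python
import sys

# Hypothetical automated remediation: restart unhealthy nodes.
# The guard caps the blast radius so a bad health check (the
# "existing problem") cannot be amplified into a full outage.

MAX_FRACTION = 0.10  # never touch more than 10% of the fleet automatically

def remediate(unhealthy: list[str], fleet_size: int) -> None:
    if fleet_size == 0:
        return
    fraction = len(unhealthy) / fleet_size
    if fraction > MAX_FRACTION:
        # A rare event involving many components: stop and escalate
        # to a human instead of letting automation amplify the failure.
        print(f"{fraction:.0%} of fleet unhealthy; refusing to act, escalating.",
              file=sys.stderr)
        return
    for node in unhealthy:
        print(f"restarting {node}")  # stand-in for the real restart call

remediate(["node-3", "node-7"], fleet_size=50)   # acts: 4% of fleet
remediate([f"node-{i}" for i in range(30)], 50)  # refuses: 60% of fleet
```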
I was reading an edition of PenTest Magazine (attached here for convenience). They’ve had a few decent articles in there, but one in particular talked about securing your users. That’s an interesting topic. An attack against your company is very likely to come through the “meatware” vector; it’s often much easier than trying to find the latest 0-day or buffer overflow. Of course you have your security policies and user training, but even security pros fall for a well-crafted phishing attack. Temper your expectations of how far you can harden and train your user base, and be prepared for a breach to come through that vector.
A lot of your defenses should focus on isolating the user population from critical systems, so that when a breach does occur, the impact is limited. Of course users need some access in order to do their jobs, and that’s where it’s critical to focus on granular access controls, specifically RBAC. You also need the capacity to detect and respond to anomalies in user behavior; that’s what will ultimately allow you to contain the threat and limit its impact.
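As a minimal sketch of the RBAC idea (the roles, permissions, and names below are hypothetical, not from the article):

```python
# Minimal RBAC sketch: users get roles, roles get permissions,
# and every access to a critical system goes through one check.

ROLE_PERMISSIONS = {
    "support":  {"read:tickets"},
    "engineer": {"read:tickets", "read:logs", "deploy:staging"},
    "sre":      {"read:tickets", "read:logs", "deploy:staging", "deploy:prod"},
}

USER_ROLES = {
    "alice": {"engineer"},
    "bob":   {"support"},
}

def is_allowed(user: str, permission: str) -> bool:
    """True if any of the user's roles grants the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

# A phished "bob" account can read tickets, but the blast radius
# stops there: no logs, no deploys.
assert is_allowed("bob", "read:tickets")
assert not is_allowed("bob", "deploy:prod")
assert is_allowed("alice", "deploy:staging")
```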