Bruce Schneier is high on my list of smart people to pay attention to. His blog, Schneier on Security, always provides useful insights into the interplay between technology and people. Yesterday, he offered an interesting observation about what he labels “the security mindset.”
Schneier on Security: The Security Mindset.
> …
>
> Security requires a particular mindset. Security professionals — at least the good ones — see the world differently. They can’t walk into a store without noticing how they might shoplift. They can’t use a computer without wondering about the security vulnerabilities. They can’t vote without trying to figure out how to vote twice. They just can’t help it.
>
> SmartWater is a liquid with a unique identifier linked to a particular owner. “The idea is for me to paint this stuff on my valuables as proof of ownership,” I wrote when I first learned about the idea. “I think a better idea would be for me to paint it on your valuables, and then call the police.”
>
> Really, we can’t help it.
>
> This kind of thinking is not natural for most people. It’s not natural for engineers. Good engineering involves thinking about how things can be made to work; the security mindset involves thinking about how things can be made to fail. It involves thinking like an attacker, an adversary or a criminal. You don’t have to exploit the vulnerabilities you find, but if you don’t see the world that way, you’ll never notice most security problems.
>
> …
I’d push his observations a bit farther. When you are designing and building systems that incorporate people and technology, you had better think both about how to make things work and about how things might fail.
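To make the contrast concrete, here’s a minimal sketch of the difference; the report-saving scenario and all the names in it are my own illustrative assumptions, not anything from Schneier’s post. The first version is written only for the case where everything works; the second is written with its failure modes in mind.

```python
import os
import tempfile
from pathlib import Path

def save_report_happy_path(data: str, dest: Path) -> None:
    """Works fine -- as long as the disk never fills and the
    process never dies in the middle of a write."""
    dest.write_text(data)

def save_report_failure_aware(data: str, dest: Path) -> None:
    """Same result, but a crash or a full disk can't leave a torn,
    half-written report behind."""
    fd, tmp_name = tempfile.mkstemp(dir=dest.parent, suffix=".tmp")
    try:
        with os.fdopen(fd, "w") as tmp:
            tmp.write(data)
            tmp.flush()
            os.fsync(tmp.fileno())   # make sure the bytes reach the disk
        os.replace(tmp_name, dest)   # atomic rename over the destination
    except OSError:
        os.unlink(tmp_name)          # clean up the partial temp file
        raise
```

The second version isn’t longer because it does more useful work; it’s longer because it admits that the happy path can break.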
Human systems are interesting and effective because they are resilient. Good designers allow for the reality of human strengths and weaknesses and factor both into their designs. Too many poor or lazy designers ignore or gloss over failure modes. How many project plans have you seen, for example, that assume no one on the project team will ever be out sick? And then management complains when the project fails to meet its deadlines.
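To put a rough number on the sick-day example, here’s a tiny simulation. Every figure in it (team size, effort, absence rate) is an assumption I picked for illustration, not data from any real project.

```python
import random

TEAM_SIZE = 5        # assumed team size
WORK_REQUIRED = 600  # assumed effort, in person-days
SICK_RATE = 0.03     # assumed chance any one person is out on a given day
TRIALS = 2_000       # number of simulated projects

def days_to_finish() -> int:
    """Calendar days until the team has delivered WORK_REQUIRED person-days."""
    done = days = 0
    while done < WORK_REQUIRED:
        days += 1
        # Each member contributes one person-day unless out sick that day.
        done += sum(1 for _ in range(TEAM_SIZE) if random.random() >= SICK_RATE)
    return days

naive_plan = WORK_REQUIRED / TEAM_SIZE  # the plan that assumes no one is ever out
average = sum(days_to_finish() for _ in range(TRIALS)) / TRIALS

print(f"plan assuming perfect attendance: {naive_plan:.0f} days")
print(f"simulated average with absences:  {average:.1f} days")
```

Even in this toy model the optimistic schedule can never be beaten, only met or missed; the slack has to be designed in up front.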
There’s actually quite a lot of good material on failure in human/technology systems and how to design for it. I’d recommend the following as good starting points:
- The Logic of Failure, Dietrich Dörner
- Normal Accidents, Charles Perrow
- Success through Failure: The Paradox of Design, Henry Petroski
- To Engineer Is Human: The Role of Failure in Successful Design, Henry Petroski
- Beyond Fear: Thinking Sensibly About Security in an Uncertain World, Bruce Schneier