One of the most interesting books I’ve read is built around an incredibly simple premise. I’m thinking of The Checklist Manifesto by surgeon Atul Gawande. He was looking for a way to improve patient outcomes after surgery, a complex and potentially high-risk activity. He got the idea of using a checklist from airline pilots and went through a series of adaptations to get it right. A key component was more democratic control by nurses, who are given room to speak up against autocratic surgeons.
I wonder if the relevant trade-off isn’t one of more process and reporting vs. increased authority and control.
Of course you mention the importance of low-level contributions, but even those have to be held to strict performance standards and alignment with safety goals.
Another analogy strikes me: that of waterfall vs. agile software development. Top-down, rigid specifications create an inability to adapt. Agile is named that for a reason. Then again, a motto of “Move fast and break things” doesn’t sound very safety conscious!
Great example! I read Gawande's book years ago, but I had forgotten about the case of the nurses. It's a good point. There's a third school of thought on safety that I'll introduce, which is a bit more democratic, once I get around to fleshing out its application to globalization. I should know more about agile than I do. I'll be curious whether you see any parallels when part three gets put up.
Another question occurs to me, and that’s the distinction between resilience and safety. Supply chain disruption is one kind of risk. Pandemics would be somewhat different. Looking forward to the next one in the series!
That's a good question. In the case of complex, tightly coupled systems, the terms are closely related, because such systems have the property that seemingly unrelated incidents can snowball into a larger disaster or crisis. Assuring safety is an attempt to shore up a system lacking in inherent resilience. At least that's how I think Perrow would see it.
And at least in terms of his theory, I think the risks are similar, though perhaps not in degree. Their respective systems are increasingly tightly coupled, allowing small errors to propagate quickly. There is the added complexity of a virus being a living, mutating thing, but supply chains aren't immune to the sudden arrival of emergent disruptive behaviors (though those are arguably easier to respond to before a full-blown crisis).