Is globalization no longer fit for purpose, if it ever was? That was the discomforting question I raised in my post two weeks ago. The globalized economic system is increasingly complex and tightly coupled, which makes it prone to unanticipated failures that can quickly cascade into catastrophe. Access to essential products and industrial inputs can be lost to disruptions brought on by new wars, pandemic outbreaks, and other world events. The rapidity of travel combined with massive factory farms turns the emergence of deadly viruses into an undesirable, but perhaps unavoidable, product of modernity.
Is crisis a normal feature of the world’s economic system? Is our only option to scale back globalization, by onshoring key manufacturing capabilities and infrastructures and by voluntarily slowing the pace of trade and travel?
Managing Complexity
Less complex technological designs and additional technical safeguards are not the only options for perilously sophisticated techno-human systems. Around the same time that Charles Perrow’s work on “normal accidents” was being published, scholars at the University of California and the University of Michigan began to more carefully study the behavior of organizations in charge of risky technologies: air traffic control, nuclear power plants, and aircraft carriers.
Political scientist Todd LaPorte, organizational psychologist Karl Weick, and others found that some institutions displayed awe-inspiring levels of safety and performance, what they called “high reliability organizations.” Such operations shared a number of features: they took errors seriously and constantly strove to better understand what caused them; their structures were flexible and allowed improvisation when necessary; they permitted even the lowest-ranking participants to contribute to decisions and even halt activities if they sensed danger; and they fostered a culture of constant communication and buy-in to the overall mission of safe operation.
Research on high reliability organizations suggests that we need not be as pessimistic as Perrow, who would recommend that we abandon rather than try to manage complex and tightly coupled technologies—at least when alternatives are available. But implementing and maintaining a high reliability culture is not as straightforward as it might seem. Even the scholars who study them warn that they are expensive, rare, and at constant risk of decay.
Think of how the remarkable reduction in major commercial aviation disasters was upset by cultural decline within Boeing. The company’s fateful mistakes with the 737 MAX were precipitated by the compartmentalization of decision-making, which produced a culture in which errors were concealed and financial concerns dominated. The FAA, for its part, seems to have grown complacent in light of the industry’s success, allowing Boeing to “self-certify” design changes that merited independent scrutiny.
Although the system for assuring the safety of passenger flight seemed to self-correct quickly in response to the two major 737 MAX crashes, albeit with signs of chronic problems at Boeing, the point remains that momentary lapses in vigilance can have considerable consequences. The cost of these lapses in fields such as airline travel is typically the lives of a few hundred passengers, but for pandemics or the potential implosion of the world’s silicon chip supply, as with nuclear reactors, the cost could be intolerably high. One possible solution, it would seem, is to follow in the footsteps of the high reliability organizations and develop an overarching global organization to ensure that the large-scale technological system that is the globalized economy is managed properly to contain possible pandemics.
Between Control and Democracy
But before we consider that, we need to think more deeply about the potential political consequences of trying to control complex sociotechnical systems. Writing in 1980, political theorist Langdon Winner argued that technologies had politics. Examining automatic tomato harvesters and highway overpasses, he pointed out that industrial machinery tended to drive economic centralization and that urban constructions had the power to discriminate against impoverished minority groups. But his more central question was whether certain technologies, simply by design, required particular political arrangements.
Karl Marx’s frequent writing partner, Friedrich Engels, pointed out that a ship at sea only runs well when ruled by a captain. This observation led Winner to wonder if technologies like nuclear energy were inherently authoritarian. In his essay he focused on the risks that would arise should the nuclear industry turn to recycling plutonium as uranium reserves declined, but his reasoning applies even without that condition. Given both the risk of catastrophic accident and the dire consequences should radioactive material surreptitiously leave the premises and fall into the wrong hands, only a captain, an absolute and centralized authority, could practically manage the risks. Protections of civil liberties and democratic dissent would stand in the way of needed surveillance, frustrating the level of control necessary to prevent disaster.
Studies of high reliability organizations, however, only partly confirm Winner’s intuition. At certain moments these organizations function as quasi-democracies: everyone’s safety concerns are taken seriously, no centralized authority barks out orders to be followed unquestioningly, and even the lowest-ranking person can stop operations to avoid an accident.
But undergirding this quasi-democratic process is an absolutist commitment. High reliability operations on an aircraft carrier are partly assured by the military being a “total institution.” The mission and community of people enacting it become participants’ entire universe; their level of dedication and vigilance regarding the cause must be unreserved. In civilian high reliability organizations, buy-in is achieved by careful selection of employees. Arrogant or complacent workers are quickly dismissed. Although the organization itself may be built upon a tolerance for constant questioning and bottom-up organizing, participants’ faithfulness to the underlying process must be unwavering. Complex and potentially catastrophic technologies set firm boundaries on democracy’s reach.
Do Globalized Societies Demand a Globalized Order?
Safeguarding global supply chains, as well as preventing the emergence of globetrotting pandemics, would seem to demand a high reliability organization on a planetary scale. In order to avert potential catastrophe, the ticket to participation may need to be strict adherence to a range of performance standards.
This wouldn’t encompass just the monitoring and regulation of factory farms, air travel, and the duplication of key industries across the globe, but also an international organization to ensure that ports and other transport systems maintain an appropriate level of slack. In this view, assuring the performance and stability of the global economic system would require a whole series of far-reaching policies, imposed across the world. Avoiding rare but potentially calamitous disruptions demands total commitment to an exacting safety culture.
This leaves us with a dilemma. Following the globalization pessimists means forgoing the many efficiencies of the global market. Rising to the organizational challenges of a globalized economic system may entail large tradeoffs in terms of democracy and national sovereignty. Is there a third path? Or is it really a choice between plague and cholera, as the Germans put it? I’ll explore a potential alternative model in the next installment of this series.
One of the most interesting books I’ve read is built around an incredibly simple premise. I’m thinking of The Checklist Manifesto by the surgeon Atul Gawande. He was looking for a way to improve patient outcomes after surgery, a complex and potentially high-risk activity. He got the idea of using a checklist from airline pilots and went through a series of adaptations to get it right. A key component was more democratic control by nurses, who are given room to speak up against autocratic surgeons.
I wonder if the relevant trade-off isn’t one of more process and reporting versus increased authority and control.
Of course you mention the importance of low-level contributions, but even they have to be held to strict performance standards and alignment with safety goals.
Another analogy strikes me: waterfall versus agile software development. Rigid, top-down specifications create inflexibility and an inability to adapt; agile is named that for a reason. Then again, a motto of “Move fast and break things” doesn’t sound very safety conscious!