Last week, the Bulletin of the Atomic Scientists announced with great solemnity that it had moved the hands on its “Doomsday Clock” to ninety seconds before midnight, meaning humanity is now the closest it has ever been to The End.
As the kids say: whatever.
The Doomsday Clock was created in 1947 for the honourable purpose of drawing attention to the threat of nuclear war. It was not, and never has been, anything like a meaningful measure of existential risk. It is a very old PR tool that should have been retired long ago.
That said, the Bulletin’s little show did accomplish something I consider quite important: It gave me an excuse to write a post about one of the more obscure, fascinating, and important books I’ve ever come across.
Written by Lynn Eden, it’s called Whole World on Fire: Organizations, Knowledge, and Nuclear Weapons Devastation.
Yes, it is as light and entertaining as it sounds.
But if you can get past the academese and bureaucratese, the book explores something astonishing and consequential. It also contains a profound warning for anyone who runs, or works within, a large organization.
Picture the detonation of a nuclear weapon.
At the heart of a nuclear explosion, the temperature reaches several million degrees Celsius. This generates a heat flash that vaporizes everything within range, including water and soil, flesh and bone. The result is an immense fireball that expands and rises, becoming the infamous mushroom cloud.
There is also a blast, expanding outward with unimaginable speed and force. It smashes everything in its path. Buildings are shattered and swept away like dust.
But this isn’t the end of the destruction. The fireball not only incinerates everything within it, it instantly sets fire to everything around it. The fires consume vast quantities of oxygen and send superheated air surging upward, drawing hurricane-force winds racing inward toward the flames, feeding them fresh air and whipping them into a fury. The resulting firestorm is catastrophic in its own right.
In sum, nuclear weapons annihilate cities with blast and fire. You know this. Everyone knows. (Or at least everyone in my generation does...)
And yet, incredibly, as documented in painful detail by Lynn Eden, for more than half a century, the United States military ignored the fire damage inflicted by nuclear weapons.
When the military calculated how much damage a particular weapon would do to a particular target, it carefully calculated the blast it would deliver. It ignored fire. It did the same when it scaled up and calculated how many nuclear weapons it needed to ensure the destruction of the Soviet military in the event of a full-scale nuclear war. Blast damage was carefully calculated. Fire damage was ignored.
It didn’t do this once or twice. It did it consistently, for decades.
The first time I heard this, I was incredulous.
Not only was this mind-blowingly foolish. It was fantastically expensive.
Nuclear weapons were and are enormously costly to build and maintain. By omitting fire damage, the US military drastically underestimated the destructiveness of its weapons. Thus, when it calculated that it needed a certain level of destructiveness, it asked for and got far more weapons than it needed. That took money away from other defence priorities. And the federal budget generally.
“Every gun that is made, every warship launched, every rocket fired signifies, in the final sense, a theft from those who hunger and are not fed, those who are cold and are not clothed,” Dwight Eisenhower famously said in his 1953 “Chance for Peace” speech. Leaving fire damage out of the calculations meant a lot more theft was required.
Worse still, the military’s blind spot was dangerous.
Merely by existing, a nuclear weapon is a risk. Whether by mishandling or malfunction, it could accidentally detonate. It could be detonated by sabotage. It could be detonated by mentally unstable personnel. Use your imagination. No matter how carefully handled a nuclear weapon may be, there are a thousand and one ways something could go wrong.
As Eric Schlosser demonstrated in his 2013 book Command and Control, the US military has had brushes with all sorts of nightmares. Likely the closest it came to an accidental detonation was in 1961, when a B-52 carrying two nuclear bombs crashed in North Carolina. After the crew ejected, the bombs separated from the free-falling plane. The government later assured Americans neither bomb had been at risk of detonating, but that was a lie. On one of the bombs, three of four required triggering mechanisms had activated. (We don’t know the full Soviet record, but there is no reason to think it was any better, and we have more than a few good reasons to think it was worse.)
Each nuclear weapon in existence — then and now — poses at least some risk of disaster. For the sake of argument, let us assume that risk is very small. That’s reassuring, right? But the more warheads there are, the greater the risk.
If there are 100 nuclear weapons, the risk is Very Small x 100.
With 1,000 weapons, the risk is Very Small x 1,000.
At its peak, the US stockpile contained more than 30,000 nukes.
That risk was Very Small x 30,000.
Which is a lot less reassuring, isn’t it?
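Arithmetic aside: to see why that simple multiplication is a fair shortcut, here is a minimal sketch in Python with an entirely made-up per-weapon probability. The exact chance of at least one disaster among N independent weapons is 1 - (1 - p)^N, which for a tiny p is almost exactly p x N.

```python
# Hypothetical illustration only: the per-weapon probability p is invented
# for the sake of the arithmetic, not an estimate of real-world risk.

p = 1e-6  # assumed "Very Small" chance of disaster per weapon

for n in (100, 1_000, 30_000):
    exact = 1 - (1 - p) ** n   # chance of at least one disaster among n weapons
    approx = p * n             # the "Very Small x N" shortcut
    print(f"{n:>6} weapons: exact {exact:.4%}, shortcut {approx:.4%}")
```

With numbers this small the two results are nearly identical; the point is simply that the risk grows in proportion to the stockpile.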
So let’s lay bare the logic:
The US military’s failure to take fire damage into account caused it to drastically underestimate the destructiveness of its nuclear weapons. Which led it to ask for, and receive, far more nuclear weapons than it needed.
Which wasted immense sums of money. And boosted the risk of catastrophe.
That catastrophe never happened.
But don’t let that fool you. It is, as Schlosser put it in his subtitle, “the illusion of safety.” If you bet your life playing craps and you roll a seven, it doesn’t mean you were not in grave danger. It means you got lucky.
Editorial aside: This is also why the massive decline in the global stockpile of nuclear weapons at the end of the Cold War, and in the years following, is one of the most important stories of progress in the past half-century. And why efforts to keep reducing those stockpiles must continue.
Now comes the killer question: How?
With so many smart people involved and so much at stake, how did the US military make such an obvious mistake? And not once or twice. But year after year. For more than half a century.
How in hell did that happen?
Eden’s answer is powerful — and widely applicable — because it is so prosaic.
She starts all the way back in the 1920s.
The First World War transformed aircraft from rickety motorized kites into effective killing machines, particularly with the development of heavy bombers by the war’s end. In the 1920s and 1930s, militaries everywhere assessed the new state of war and concluded that aerial bombardment would play a central role in future conflicts.
Many theorists came to believe that massive bombing of civilian populations was coming because it would be devastatingly effective — as civilians would psychologically break under the strain and cities would collapse into chaos.
But the US military (the Army Air Corps, later the Army Air Forces, and the US Air Force after 1947) focused mostly on so-called “precision bombing” (in reality, it would take another sixty years for precision bombing to cease to be an oxymoron) that would deliver high explosives to chosen targets, weakening the enemy’s ability and will to fight. What mattered in this strategy was blast damage. Put enough blast on the right targets and you win. Incendiary bombs and fire damage weren’t precise. So they had no role. And the military ignored them.
This “blast damage frame” shaped how the military saw situations and planned operations. “During World War II,” Eden writes, “mathematicians, structural engineers, and operations analysts working within the blast damage frame greatly increased organizational capabilities to predict such damage.”
And yet World War II brought grisly new experience with fire damage. The US military’s own bombing campaign against Japan (fire-bombing in Europe was a British specialty) eventually turned toward incendiary devices, which were terrifyingly effective in destroying cities largely made of wood.
Then came Little Boy and Fat Man.
The atomic bombs dropped on Hiroshima and Nagasaki unmistakably inflicted horrific damage with both blast and fire. One would think this would transform the military’s thinking.
But it didn’t. The “blast damage” frame was too strong.
Predicting blast damage was what the military knew how to do. So it kept doing that. The military did not know how to predict fire damage. So it didn’t.
Thus, organizational determination to predict blast damage during World War II led to the organizational capacity to do so, which provided the basis for building more capacity after the war. In contrast, lack of attention to the prediction of fire damage led to the allocation of fewer resources. Even less attention after the war resulted in a total incapacity to predict nuclear fire damage: no recognized experts, no manuals, no knowledge-laden organizational routines.
As the Cold War developed, this state of knowledge was not seen as a product of flawed human organization. No, it was reality. The fact that the military couldn’t predict fire damage meant fire damage couldn’t be predicted and didn’t need to be predicted. It wasn’t important. Blast damage was important.
The military was now locked into a self-reinforcing loop. The very fact that it knew how to predict blast damage but not fire damage caused it to pour more resources into predicting blast damage but not fire damage. Which further strengthened its perception of reality. Which caused it to do more of the same.
A little critical thought would have revealed that a huge piece of the puzzle was missing, and that the omission would badly skew the numbers, waste huge amounts of money, and put the world in greater danger. But the military got organizationally locked into this mindset thanks to a sequence of events that began more than two decades before nuclear weapons were even invented.
That’s the unsettling power of path dependence.
Eden’s prose is academic in the good and bad senses of the word, but in discussing the implications of her research for large organizations — military or civilian — she came up with a sentence as snappy as anything in Harvard Business Review.
“Organizations should think about what they are not thinking about,” she writes.
Indeed. If there’s one thing I learned reading Whole World on Fire, it’s that even the largest, most sophisticated organizations can become blind to the seemingly obvious. And remain blind for decades.
If I were responsible for a large organization — happily for me, my team consists of me and my dog — I would try, every now and then, to assume that my organization is blind to something important.
What is it? What should we see that we are not seeing?
Or to put that in Eden’s terms, now and then, I would think about what we are not thinking about.
Really interesting. Thanks.
I've got a slightly more cynical take here: there were people within the organisation who were aware of the impact of not modelling fire damage and, as you point out, knew that it would lead to fewer resources being devoted to their particular sector.
If you interpret the US military's post-WW2 history through the lens of inter-service rivalry for resources, it explains a lot.