I've got a slightly more cynical take here: there were people within the organisation who were aware of the impact of not modelling fire damage and, as you point out, knew that it would lead to fewer resources being devoted to their particular sector.
If you interpret the US military's post-WW2 history through the lens of inter-service rivalry for resources, it explains a lot.
A relatively minor mistake, compared to thinking that a nuclear war, whatever the damage, would be like any previous war, or that anything but full nuclear disarmament was sane.
Hell, it's a mistake to think that war makes any sense any more. You just don't profit by "taking over" a nation any more, as "profit" now all comes from the hard work of educated people using intact industrial infrastructure - the whole war business model assumes you can pay for it by controlling the work of peasant farmers, which was true until a hundred-odd years ago. The US has just spent 20 years proving that the most powerful nation can't profit in any way from taking over vastly weaker ones, even though one was the weakest on earth and the other had vast oil wealth. Both cost trillions.
And yet, you give the Pentagon another $850B, with spending in other departments rounding it up to a trillion - $3,000 per person. Beats me.
All these civilizations work so very hard at achieving failure. Makes us all wonder. Ever more land for ever more resources to feed a system that kills us all with extreme prejudice. Progress? Advanced primates?
This is thought-provoking - and a superb articulation of a wicked problem I have struggled to explain. I was a huge fan of Superforecasting from the outset, yet I am ambivalent about placing too much hope in it without additional capabilities being added to it.
If we are too often path dependent, if we can’t see what we can’t see (like the fish swimming in a bowl and not knowing it is in water), what will cause us to apply even SUPER forecasting to the right dimension? In your example, applying it to fire damage?
So the question that fascinates me now is how organizations can surface their path dependencies. What techniques can they apply to unpack their tacit assumptions?
I have some hope for some kinds of scenario work, but for a thorny knot like this, we need a whole portfolio of methods.
And then, once we have surfaced these paths, how do we ensure that the built-in cultural reinforcements of those dependencies get challenged and overcome?
Thanks so much for writing about nuclear weapons. The degree to which our entire culture is not thinking about nuclear weapons is perhaps the best argument for why we need to slow down the knowledge explosion. We simply aren't rational enough to safely manage ever more, ever larger powers, delivered at an accelerating pace.
And yet they built the Japanese and German villages at Dugway specifically to determine the best way to burn out large areas of civilian housing. (The Japanese houses just needed incendiaries; for the German ones you wanted HE in the first wave to blow open the tiled roofs and expose the structures, then a follow-up with incendiaries.) It also recalls Douhet's dictum that you want to blow down the buildings with HE, set them alight with incendiaries, then drop gas to kill the firefighters - an aim atomic bombs carry out in one neat package.
This reminds me of that study about the five monkeys with the ladder, which I just googled, only to discover that it's not even true. Can't believe it! My CTO talks about it every day and there's a TED talk on it... Scandalous.
It turns out that in the real study, the monkeys actually do learn to ignore the entrenched practice, so, just like in this tale of eventually breaking with old mindsets, I guess there is hope for us all (it just may take a while sometimes).
Really interesting. Thanks.
Great article. For those who haven't read it, I highly recommend Schlosser's Command and Control.