I do enjoy your critiques of the doomsayers, which don't undermine them entirely but just like to point at the often-missing limb. Perhaps they need the word "polycrisis" because, looking at the world we see every day, much doesn't seem immediately so bad, and the individual crises alone can't summon enough fear for these prophets.
I have some sympathy for that view. I suspect that thinking about reality as being dominated by complex, non-linear, entangled systems is far from being as common as it should be in governments and major corporate C-suites. "Polycrisis" isn't a bad way of getting across that perspective. It just ain't new.
A two-armed response, chapeau!
I sometimes think that what makes these times unique is the prevalence of unchallenged wild irrationality across all sectors of society -- including supposedly objective ones such as science. Forceful assertions are routinely made and policies put into place with minuscule supporting evidence and sometimes massive contradictory evidence.
There's also a tendency to label every single problem a "crisis". The notion of a "polycrisis" is just the same idea with grade inflation: it's not just a crisis or two; it's a crisis crisis!
But then I think back to the late 1960s and how wildly irrational those times were as well. Somehow we survived as a culture and more or less regained our sanity. I hope history repeats itself in that regard.
I would like to imagine that it’s possible to arrive at some sort of middle ground. It seems to me that chalking all apprehension up to hindsight bias and doom-mongering ignores the historical reality that some periods were indeed more unstable and precarious than others, even though no period was ever free of uncertainty. It’s true that, going solely by the generalizations employed in public pronouncements, we could comfortably situate them in any period, which is why I would’ve liked to see more focus on actual research and empirics here, as opposed to just dissecting the speculations and ruminations of talking heads.
This is truly excellent but/and it impels me to consider several things I had not pondered sufficiently before.
In particular, a certain Dan Gardner has taught me to think about how we should measure the factors we wish to make predictions or judgements about, so that we can look back on that thinking at some time in the future and know whether we were smart, wise, or merely deluded, and most of all, why.
I totally agree that complexity, non-linearity and entanglement have been with us for some time, as you say. I also believe that our perceptions of these factors (as distinct from their reality) have intensified, as modern technologies have expanded our awareness of developments somewhere around the globe, or perhaps buried deep in a scientist’s lab, as well as of their potential interconnections.
But is it possible that these root causes of ‘polycrisis’ have grown or escalated in recent years and may be doing so at an increasingly rapid pace? If this IS possible, how would we measure that exactly? Or if we believe the opposite (that complexity is declining or stable) how would we measure that?
It is also possible that complexity and its crisis cousins are addressable up to a certain point but beyond that notional line could simply overwhelm us, at least without our applying far better ways of understanding and solving them. Again, if possibly true, how would we define that line and measure our closeness to it?
Dan, I need to thank you sincerely (again) for introducing me to the intriguing writings of Ethan Mollick (at oneusefulthing.com).
I feel sure you will have already dug into his latest missive on the potential impacts of ChatGPT etc., which he subtitles: “A sudden increase in AI capability suggests a weirder world in our near future.” I think we could reasonably read his word “weirder” as “non-linear and unpredictable.”
He also makes the case that we have been in a long period of rather slow gains from technological change, but that AI will cause those impacts to skyrocket far faster and more disruptively: “the Singularity - (refers) to a moment where our technology (often specifically AI technology) accelerates suddenly and irrevocably. A moment where every graph of technological progress becomes vertical.”
Should we agree with his prediction? If so, could this event push us across that line-of-no-return I mentioned above?
I am far from a doomsayer on the forces of turbulence now present, or on our horizon. Indeed, I especially appreciate the postings of Roser, Ritchie & company from Our World in Data about all the ways humankind is progressing.
However, I am also intently focussed on what we (societies, communities, corporations, individuals) need to do to effectively prepare ‘for whatever unfolds’.
You use the evocative verb “brace” to describe how we must get ready. I would really value your expanding on what you include under that heading and where, in future, we can find the resources it suggests. As a historian, you will likely have many insights to share with us that we can draw from both our past and present!
Thanks, Morrey. You raise a smart and essential point which I debated discussing in this piece but ultimately decided not to because...it's complicated.
In short: Is complexity (in the technical sense) increasing? If so, is it increasing so rapidly that we are in a world materially different from the world of, say, the 1970s and 1980s? And will the world of 2030 or 2040 (if we continue on the same trajectory) be materially different again?
My answer to the first question is simple: Yes. More and faster communication links and more globalization = more complexity. But my answer to the second is No. And my answer to the third question is a more qualified/less confident No. For reasons that would require a long essay. Or a short book. Which is why I decided against going down this rabbit hole.
If ever I've read one of your pieces that provides food for thought, this is it. I've constructed dozens of strategic plans/business plans in various capacities in business, government, and the non-profit sector - always linear (although I didn't consciously think of them that way). While consideration of potential risks and how they may be alleviated is a necessary part of every such plan, I had never thought of looking at them through the lens of "complex, non-linear, entangled systems", and at least at this moment, cannot conceive of how one might do that. If I understand what you are saying correctly, it might be analogous to multivariate analysis, and to do something like that would (I think) require a degree in advanced mathematics.
I feel like it comes down to "do the best you can with what you've got". But hey, having in mind one of your previous posts, maybe AI can do that for us - not that I have, or am ever likely to have, access to AI to do that.
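For what it's worth, looking at a plan's risks as an entangled system needn't require advanced mathematics. Here is a toy Monte Carlo sketch (everything in it, from the risk names to the probabilities to the coupling factor, is invented for illustration, not drawn from the post): it compares treating three risks as independent line items against letting one risk firing raise the odds of the others, which is roughly what "entangled" means in practice.

```python
import random
import statistics

# Hypothetical plan with three risks. Probabilities and losses ($k)
# are made-up numbers purely for illustration.
BASE_P = {"supplier": 0.10, "funding": 0.10, "staffing": 0.10}
LOSS = {"supplier": 40, "funding": 60, "staffing": 30}
COUPLING = 3.0  # each risk that fires multiplies the others' odds

def simulate(entangled: bool, trials: int = 100_000) -> list:
    """Return simulated total losses across many runs of the plan."""
    losses = []
    for _ in range(trials):
        total, fired = 0.0, 0
        for name in BASE_P:
            # Independent view: probabilities never change.
            # Entangled view: prior failures raise later probabilities.
            p = BASE_P[name] * (COUPLING ** fired if entangled else 1.0)
            if random.random() < min(p, 1.0):
                total += LOSS[name]
                fired += 1
        losses.append(total)
    return losses

random.seed(0)
for label, flag in [("independent", False), ("entangled", True)]:
    losses = simulate(flag)
    p99 = statistics.quantiles(losses, n=100)[98]  # ~99th percentile
    print(f"{label:11s} mean={statistics.mean(losses):5.1f}k  p99={p99:5.1f}k")
```

The average loss barely moves, but the tail of the entangled distribution fattens noticeably: the bad scenarios come from risks compounding, not from any single risk. That is the kind of insight a linear risk register, which just adds up expected losses, structurally cannot show.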
Layperson’s thought here: Faithfully applying that lens would mean infusing strategic plans with more humility and delineation of a wider possibility space. I wonder at what point the client would find that counter to what they thought they were paying for though…
Quite right. And -- safe prediction -- clients would hate it.
Listening to this reminded me of what you’re discussing here and another plea for epistemic humility https://www.volts.wtf/p/on-the-abuse-and-proper-use-of-climate