In the first episode of the new season of Stranger Things, there is a scene that is pure joy for so many of us fifty-something, one-time high school nerds.
It’s 1986. Mike, Dustin, Erica, and some other teens are playing an intense round of Dungeons & Dragons, a role-playing game popular with unpopular kids.
In the game, the players encounter the terrifying monster Vecna and battle ensues. It goes badly. Vecna is bloodied but only two players remain and each has one attack left. The dungeon master — the game’s narrator and referee — tells them the probability of either landing a blow strong enough to finish the monster is one in twenty. They should run.
They do not run.
Dustin takes the legendary twenty-sided die in hand. To an observer who has never played Dungeons & Dragons, the dungeon master appears to be Fate, the one deciding who lives and who dies. Not so. The dice are Fate.
Dustin rolls the die.
It bounces and tumbles and comes up … eleven. Dustin dies.
Erica is the last player standing. She rolls.
In slow motion, the die spins, twists, and tips one last time, coming up … twenty.
Vecna is dead.
They all go nuts.
Anyone who has played hours and hours — and hours — of Dungeons & Dragons knows exactly how they feel. Because if you spend that much time playing Dungeons & Dragons, it is all but certain that you will encounter many one-in-twenty situations. Some will present a small chance of disaster. Others, as in this episode, will offer a small chance of glory.
In almost all these instances, the small probability will not come to pass. You will roll a five. Or a sixteen. Or an eleven.
But if you play hours and hours — and hours — of Dungeons & Dragons, and if you encounter many small probabilities, and if you roll the twenty-sided die many times, you will roll a twenty. Guaranteed.
It’s basic probability: Most small-probability events will not come to pass — by definition — but any small-probability event can come to pass. And if you repeatedly encounter small-probability events, it’s just a matter of time before one will.
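That arithmetic is easy to check for yourself. As a minimal sketch (the function names are mine, not anything from the show or the game's rules), the chance of rolling at least one natural twenty in n rolls of a fair d20 is 1 − (19/20)^n, which a brute-force simulation confirms:

```python
import random

def chance_of_at_least_one_twenty(n_rolls):
    """Exact probability of at least one 20 in n_rolls rolls
    of a fair twenty-sided die: 1 - (19/20)^n."""
    return 1 - (19 / 20) ** n_rolls

def simulate(n_rolls, trials=100_000, seed=0):
    """Estimate the same probability by rolling dice many times."""
    rng = random.Random(seed)
    hits = sum(
        any(rng.randint(1, 20) == 20 for _ in range(n_rolls))
        for _ in range(trials)
    )
    return hits / trials

# On any single roll, the improbable usually does not come to pass...
print(f"{chance_of_at_least_one_twenty(1):.0%}")    # 5%
# ...but over a long campaign, it is all but guaranteed.
print(f"{chance_of_at_least_one_twenty(100):.1%}")  # 99.4%
```

One roll gives five percent; a hundred rolls push the chance of seeing at least one twenty above ninety-nine percent. Repeated exposure turns "improbable" into "just a matter of time."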
Whether disaster or glory, when that happens in the context of a game in which you have invested so much time and passion, it’s a powerful experience. You feel it. And you don’t forget.
I recount all this not from a desire to wallow in nostalgia. (Or not only that.) I recount it because there’s an important insight here.
Yes, really. An important insight in Dungeons & Dragons.
In fact, I will go so far as to suggest that if you really understand this insight, and learn to make practical use of it, it could significantly improve the quality of your decisions, and thus the quality of your life.
Are ye intrigued, wandering adventurers? Then light a torch and let us enter the dungeon of probability….
In Superforecasting, Phil Tetlock and I wrote about how people routinely struggle to understand and use probability. By that, we didn’t mean sophisticated probability theory. We meant extremely simple concepts.
Here’s one: If I say there is an eighty percent chance of something happening, how likely is it that this thing will not happen?
You rolled your eyes, didn’t you? Nothing could be more obvious. The answer is twenty percent. If you possess even the slightest numeracy, you know that. It’s ridiculous to even ask the question.
But that’s true when we think about probability in the abstract. What happens when we are faced with real issues and we need to use probability? Then, we routinely do not rise to even that level of sophistication.
Instead, we tend to think in binary terms: Something will happen or it won’t. Or if circumstances force us to admit we can’t be definitive, we might say “maybe.” This is what Phil and I called the “three-setting mental dial.” It will / it won’t / maybe. That’s our default mode of thinking.
The confusion caused by the three-setting mental dial is pervasive. Robert Rubin, the former treasury secretary [under Bill Clinton], told me how he and his then-deputy Larry Summers would often be frustrated when they briefed policymakers in the White House and Congress because people would treat an eighty percent probability that something would happen as a certainty that it would. “You almost had to pound the table, to say ‘yes, there’s a high probability, but this might not happen,’” Rubin said. “But the way people think, they seem to translate a high probability into ‘this will happen.’”
These are smart, educated, successful people. (For the most part. No jokes about Congress, please.) And they’re making decisions about serious matters of public policy. If you said to them, in a meeting, “now please remember that when I say there is an eighty percent probability that something will happen, I am also saying there is a twenty percent probability it won’t,” they would be insulted. And yet you must say that — perhaps a little more tactfully — because many of them will, in fact, treat an eighty percent probability as if it were a certainty.
Rubin also noted that this didn’t always happen. If the probability cited was close to fifty percent, people understood that the outcome may or may not happen. “If you say something is sixty/forty, people kind of get the idea,” he said. But all too often, these smart, educated, successful people dealing with important matters treated higher probabilities as certain. And lower probabilities as impossible.
It will / it won’t / maybe. That’s the three-setting mental dial at work.
This mistake is extremely common. Even David Leonhardt, the very smart and accomplished New York Times journalist who routinely works with data, who once wrote a column warning readers about exactly this mistake … once made exactly this mistake.
So what happens when we encounter extreme probabilities? Like ninety-five or ninety-nine percent? Or five percent or one percent? Then it’s absurdly easy to fall into the trap. And that is dangerous.
In November 2016, shortly before the presidential election, poll aggregators and forecasters were unanimous: Hillary Clinton would probably win the election. Most said she had an eighty percent chance of winning. Ninety percent. One or two suggested the probability was ninety-five percent or higher. Nate Silver, the famous poll aggregator and forecaster, put Clinton’s odds considerably lower, at around seventy percent, but that still made her the heavy favourite.
I don’t need to say what the result of the 2016 presidential election was.
The reaction to this shock was illuminating: There was near-universal agreement that the pollsters and forecasters had been horribly wrong. Even Nate Silver, who was far less bullish on Clinton, was widely mocked and condemned.
But why? Silver’s seventy percent for Clinton was a thirty percent for Trump.
Maybe Trump just got lucky.
The same was true of all the other probabilities. Eighty percent chance of a Clinton win? That means twenty percent chance of Trump winning. That’s one in five.
Ninety percent chance of a Clinton win? That’s ten percent for Trump.
Even a ninety-five percent chance of Clinton winning gives Trump a five percent chance.
Five percent is one in twenty. That was the probability of Erica rolling a twenty. And as every fifty-something, one-time high school nerd who played Dungeons & Dragons for hours and hours knows all too well … sometimes the twenty-sided die comes up twenty.
So why was everyone so sure that the pollsters and forecasters were wrong? Why did almost no one even consider the possibility that Trump simply got lucky?
Because there are no dice in life.
When you play Dungeons & Dragons, and the dramatic moment comes, and you watch Fate tumble down the table, it’s impossible not to see there is luck at work. Roll the die at one moment, get one result. Roll it at another, get a different result. Change almost anything — the thickness of the tablecloth, the sweatiness of your palm, anything — and instead of crowing about your victory over Vecna, the monster feasts on your flesh. It’s all out of your control. All you can do is roll.
But what happens when a shock like the election of Donald Trump occurs?
Then we explain.
Well, you see, it is perfectly clear that the polls were skewed, forecasters underestimated rural alienation, and the … blah blah blah. We concoct post-facto stories that provide pat explanations for what just happened, a process so natural and inherent to human cognition that it is almost automatic and effortless.
And our stories do more than merely explain. They reveal — now that we think of it — that what happened was much more likely to happen than those other people thought. Why, in fact, it was likely to happen. Or even a sure thing! And if we keep talking long enough, we may convince ourselves that we, in fact, expected it to happen.
This is a well-demonstrated facet of human psychology. It’s usually called “hindsight bias” but it has a more evocative name: It is the “knew-it-all-along phenomenon.”
As a result, when an extremely unlikely event happens, we don’t continue to see it as an extremely unlikely event. So we aren’t smacked in the face with the fact that sometimes a roll of the twenty-sided die can come up twenty. We don’t learn that crucial lesson.
And we continue to treat improbable events as impossible events. Which is dangerous.
Consider pandemics. New or rare viruses pop up unexpectedly now and then. The great majority don’t take off and become highly fatal pandemics, so the probability that any one such eruption will turn the world upside down is very small. And if you treat that very small probability as an impossibility, you will do something like not pay for pandemic insurance, or let stockpiles of pandemic supplies go out of date, or close the office that monitors these things to provide early warning. Why pay for something you don’t need?
You’ll be fine doing that. In fact, you’ll save money! Clearly, you made the right call. Or so you feel.
Until the die comes up twenty.
Or consider again the election of 2016. Here’s something I wrote a week after that shocking night:
In the last couple of days of the election, the decline in Clinton’s poll numbers had been arrested and most analysts gave her between a 70% and 95% chance of winning. And yet the tone suggested more than that. Many, many observers noted the probabilities and then said words to the effect of “she has it in the bag.” In other words, they explicitly or implicitly acknowledged that the election was not a sure thing but proceeded to talk as if it were. You could see that in the shock after Trump won: So many people had treated a high probability as a certainty. Or to put that another way, they treated a low probability as an impossibility. That is not a small mistake. As I reminded folks on Twitter shortly before the election, if you play Russian roulette there is only a 16.66% chance of dying but that does not mean you should relax and pull the trigger.
We now know that Clinton lost because Democratic turnout was way down from the previous election. That was particularly true among black voters in some of the key states that swung the election. So here’s my hypothesis: Democratic supporters may have heard the probabilistic forecasts but there was no one “pounding the table” to make them understand that an 80% chance of a Clinton win means a 20% chance Trump would win. On the contrary. The tone of pundits assured them that 80% meant it was certain Clinton would win. So some didn’t bother to vote.
Was that enough to make a difference to the outcome? I don’t know. But given the margins involved, it seems to me quite plausible that the answer is yes.
Low probability is not zero probability and when the consequences of that low probability coming to pass are great, it should not be ignored: However many chambers a revolver has, it must be handled with care if even one contains a bullet.
So how can we fix this? Simple. Play lots of Dungeons & Dragons.
Yes, really. Get invested in the story. Care about the character you play. Eventually, you will encounter low probability events and you will roll the twenty-sided die. Chances are it will go the way it did for Dustin.
But eventually, if you keep at it, you will roll a twenty like Erica. Then you will see, in unmistakeable and undeniable form, that improbable does not mean impossible — and you will get an emotional wallop that will make the lesson deeply memorable.
You will experience probability. And that is much more profound than anything a teacher or an author can tell you.
This is why, on that long day and evening in 2016, when everyone said Hillary Clinton had it in the bag, I did not relax. I believed the forecasts were roughly correct. I still believe that now. But I did not relax.
I felt about Donald Trump how the kids on Stranger Things feel about Vecna, so I was on edge, muscles tense, stomach in a knot. And nothing anyone said could relax me.
Even if every forecaster on the planet had been unanimous that Clinton had a ninety-five percent chance of winning, I would not have treated it as a done deal because I learned long ago — I experienced long ago — that ninety-five percent “it will happen” means five percent “it won’t.” And sometimes the die comes up twenty.
That’s not knowledge. It’s not an intellectual thing. It’s something I feel. Thanks to, yes, playing that silly game decades ago.
Now, if you’re too cool for Dungeons & Dragons — most people are — you can use any other dice-driven game that really grabs you.
In this article, British science journalist Tom Chivers made a similar point about learning from Warhammer, a game in which strange little figurines are moved about a board and they live or die on dice rolls. It’s not my cup of tea. (I’d mock Tom but I can hardly do that now.) But the lessons about probability are the same. (I should also note that Tom is a wonderfully sensible source of commentary on science and reasoning and how not to go mad in this world. Follow him on Twitter or check out his books on AI and statistics in the news.)
You can also bet on craps or roulette. If you put enough money on the line you can be sure there will be plenty of emotional wallop to make the lessons sink in.
But Dungeons & Dragons is a lot cheaper.
Maybe my thinking is too philosophically deterministic, but is it fair to think of probability for some of these major events not as “there’s an 80% chance this happens and a 20% chance it doesn’t,” but rather as “based on our best information, we’re 80% confident this will happen, and therefore there’s a 20% chance we’re wrong”? I mean -- by election day 2016, the die was cast and there was no longer a 70% chance of a Clinton win actually happening -- we just didn’t have all the information necessary to know that, and based on the info we *did* have, there was reason to believe with 70% confidence that Clinton would win.
(Perhaps what I’m saying is painfully obvious, or I’m out to lunch, but something rubs me the wrong way about saying there’s some percentage chance of something happening or not at a moment in time when it’s too late for an actual course change toward a different result -- we just don’t know the result yet at that moment.)
I once played a game of DnD where all I needed was a 2 or higher to succeed, and I rolled a 1.