In How Big Things Get Done, the first advice Bent Flyvbjerg and I offer anyone thinking of tackling a big, ambitious, complex project is simple.
Slow down.
For various reasons — psychological, political, cultural — people want to get big projects going as soon as possible. Put shovels in the ground. Measurable progress. Push, push, push. Unfortunately, this approach is as bad as it is common — because it means planning will not be nearly as detailed and rigorous as it needs to be. Problems aren’t recognized. Solutions aren’t found. And overlooked problems don’t vanish. Sooner or later, they surface, and because the project is already in delivery, they risk doing real damage, particularly when problems bump up against problems, like cars colliding on an icy highway. This is how a project that starts in a sprint turns into an agonizing crawl that will finish desperately late and horribly over budget.
So we urge readers to “think slow, act fast” by putting a heavy emphasis on developing a detailed, tested, reliable plan before doing anything else. That takes a lot of time and effort. But it boosts the probability of a smooth and swift delivery, and it’s in the delivery that costs can explode and schedules go up in flames.
As they say in the military: “Slow is smooth. Smooth is fast.”
Much of the book explains how to “think slow, act fast.” Today, I want to talk about the crucial first step, but also connect it to some fundamental psychology — because that first step is applicable far beyond big projects. As you’ll see, it applies to essentially any situation involving important decisions in complex environments. And there’s a simple way to implement it.
So what’s the first step, as discussed in How Big Things Get Done?
If you’re already at the point where you’re contemplating a big project, you almost certainly will have a sense of what the project is, how it should be done, and why you’re doing it. In fact, that will all seem obvious. Too obvious to waste time discussing. You want to get going.
If someone proposes building a bridge to connect a populated island with the mainland, for example, a predictable conversation will follow. How much will the bridge cost? Who should pay? Where should the bridge be built? What about environmental impacts? And so on. If the good that we hope the bridge will do is discussed at all, that discussion will be quick and superficial. It’s obvious, after all. Why waste time?
In the book, Bent and I urge people to set all that aside. Instead, for the first step, you need to ask the most fundamental question.
We don’t build bridges for the sake of building bridges. We build bridges because we expect them to do good things for people. In this particular case, what are those good things? Why should we do this project?
The master of this approach is Frank Gehry. When clients come to him, they usually think they know what they want and they ask Gehry to do it. But rather than simply saying yes or no, Gehry asks questions about the client and the project that boil down to “Why are you doing this project?” And he isn’t satisfied with glib answers. He asks probing questions, exploring who the client is, what the client needs, and most of all, what the client truly wants out of the project.
As a result, Gehry often comes up with ideas that achieve the client’s goals better than what the client walked in with. That’s what happened in Bilbao, Spain. The client asked Gehry to renovate an old building. Rather than say yes or no, Gehry questioned the client closely about what they hoped to get out of the project — and proposed a radically different project that he thought would deliver better. The result was the Guggenheim Bilbao. Today, it is one of the most famous and acclaimed buildings in modern history. But more importantly, it not only delivered the client’s goals, it far surpassed the client’s wildest dreams. That never would have happened if Gehry hadn’t started by setting aside assumptions and asking fundamental questions.
Now let’s go back to that bridge again.
From How Big Things Get Done:
Picture politicians who want to connect an island to the mainland. How much would a bridge cost? Where should it be located? How long would it take to build? If they discuss all this in detail, they are likely to feel they have done excellent planning when, in fact, they started with an answer—a bridge is the best solution—and proceeded from there. If they instead explored why they want to connect the island to the mainland—to reduce commuting time, to increase tourism, to provide faster access to emergency healthcare, whatever—they would focus first on the ends and only then shift to discussing the means for achieving those ends, which would be the right order of things. That’s how new ideas surface: What about a tunnel? Ferries? A helipad? There are lots of ways to connect an island with the mainland and meet a need. Depending on the goal, it may not even have to be a physical connection. Excellent broadband service may do what is required, and more, at a fraction of the cost. “Connecting” the island might not even be necessary or advisable. If access to emergency healthcare is the concern, for example, the best option may be installing that service on the island. But none of this will come to light if the discussion starts with an answer.
I suspect that, laid out like this, our advice seems simple and irrefutable. And obvious. But there’s the puzzle. If this is obviously the sensible way to proceed, why do projects so often not start with exploratory questioning of the project’s ultimate goals?
A major cause is WYSIATI.
Daniel Kahneman’s gloriously awkward acronym WYSIATI (“wiz-ee-at-ee”) stands for What You See Is All There Is. WYSIATI simply means that whatever we happen to know at this moment feels like all the relevant information there is. This feeling is essential for intuitive judgement to function because the essence of intuition is speed and effortlessness. We simply look and — SNAP! — we feel the answer. If we instead felt a nagging sense that we need to gather more information, we would hesitate. And hesitation was a good way to be eaten by a lion in the long era when being eaten by lions was a top concern for our species.
Today, WYSIATI may still help keep you alive in some circumstances. If you see a shadow moving toward you in an alley, and you feel fear, you should listen to that fear and not, say, pull out your phone and Google crime statistics. WYSIATI and the intuitive judgement it empowers can also work astonishingly well if you are highly experienced in some task involving predictable patterns. Every elite athlete is proof of that. So are the master builders discussed in How Big Things Get Done. And lastly, WYSIATI works well enough for the countless mundane decisions that aren’t important enough to warrant a serious expenditure of brain power: If you see a flavour of ice cream that appeals to you, it’s perfectly reasonable to order it without continuing down the long list of flavours.
But when making important decisions in this complex, modern world, WYSIATI can cause us to make highly consequential mistakes.
Last March, in the first post of this newsletter, I wrote about a classic book called Thinking In Time, written by Richard Neustadt and Ernest May. Neustadt was a Harvard political scientist who also worked for three decades as an advisor to presidents and White House staff. May was a renowned historian who also advised top officials. They had a view of high-level decision-making second to none. And they identified one mistake that was as common as it was dangerous: When people are confronted with a new situation, they do not investigate the situation and learn more. Instead, they assume that they understand the situation well enough to start making decisions, so they discuss the question, “What should we do?”
That is WYSIATI at work.
Neustadt and May illustrated with a striking story from July 1979, when rumours surfaced in Washington that a Soviet brigade had been stationed in Cuba. Intelligence confirmed it. The Soviets were back in Cuba!
Officials furiously debated how to respond. The news leaked. Hawks talked tough in public. Doves counselled diplomacy. It started to feel like a frightening sequel to the Cuban Missile Crisis was underway.
But as the crisis unfolded, intelligence officers quietly investigated. They discovered something rather important.
The Soviets had not put the brigade in Cuba recently. Or even in the past year or two. In fact, the brigade had been stationed in Cuba since the days of the Cuban Missile Crisis, with the knowledge and acceptance of the United States. American officials had simply forgotten.
As I wrote last March:
The embarrassed Americans explained everything to the Soviet ambassador. He was stunned. “Do you expect me to get people in the Kremlin to believe this story?” the ambassador asked.
“In Moscow, apparently, no one did,” wrote May and Neustadt. “There was much speculation about what Carter’s motives had been.”
And with that, an especially ridiculous chapter in the history of the Cold War concluded.
Neustadt and May noted that the White House could have avoided this fiasco, and many others like it, by adopting a new standard operating procedure: When confronted with a novel situation, never start by discussing how you should respond. Instead, always ask the same question: “What’s the situation?”
What are we seeing? Do others see something else? How did we get here? What led up to what we are seeing? What information are we missing? Talk it out. Explore and learn. Much of what is said in this exploration will only confirm what is already obvious. But it may occasionally surface surprising information that changes the character of the problem. If White House officials had applied this procedure in 1979, they would have quickly discovered that their assumptions about the Soviet soldiers in Cuba were false. And the apparent “crisis” would have vanished.
Years after I read Thinking In Time, I interviewed Paul Van Riper, a retired three-star general who was, and is, a legend in the US Marine Corps. In addition to being a highly decorated combat veteran, Van Riper was the Marine Corps’ in-house intellectual in the 1980s and 1990s, when he studied complexity theory and oversaw the rewriting of Marine Corps doctrine. I wrote about how Van Riper solved a seemingly unsolvable problem in this piece.
One of Van Riper’s central lessons for military officers? Slow down. “People want to immediately get into solving the problem before they really understand it,” he told me.
Faced with a major problem, Van Riper brought officers together and told them they could not talk about what they should do about the problem. Instead, they were to talk only about the circumstances they faced. Or to put that in Neustadt and May’s terms, Van Riper asked his officers to explore the question, “What’s the situation?”
All of this will sound familiar to product designers.
Before good designers make any design decisions, they set aside their existing knowledge and assumptions and instead carefully explore — like anthropologists studying a newly discovered band of hunter-gatherers — what people are doing and why they are doing it. Only then do they turn their minds to how they can help. (I will have more about this in an upcoming look at the work of designers at Philips, the Dutch medical technology giant.)
So think about this. Product designers realized that drawing conclusions prematurely on the basis of an assumed understanding of the problem is a trap. The solution, they concluded, is to set aside existing thoughts and spend time in an open-minded exploration of the situation.
A top military general drew the same conclusion about officers at war.
Two top White House advisors spotted the same mistake and urged the same solution.
And here come Bent and I saying much the same about big projects.
These are very different people operating in very different fields. And yet, we all see the same behaviour — people rushing forward and making decisions on the basis of assumptions. And the same solution — replacing the plunge ahead with a good look around.
There are lots more illustrations in lots more fields. “Theory of change” program design, used by philanthropies and governments. Jeff Bezos and his “working backwards” process at Amazon. Some of Peter Drucker’s writings in the 1950s. Even — this is one I only recently came across — Henry Ford in the glorious early days of the Ford Motor Company.
I’ll elaborate on some of these processes in future. But for now I just want to underscore the basic point.
There is a general human tendency to intuitively size up situations and feel confident that we know enough to make decisions. Sometimes, that’s a fine way to proceed. But for big calls, it’s a big mistake.
Assume there’s more you need to know. Look around. Explore. Ask “what’s the situation?” with genuine curiosity and an open mind.
And most of all, slow down.
Postscript: If you’re a paid subscriber and you’d like a signed copy of How Big Things Get Done, watch your email. I’ll pick several names randomly and send emails to winners soon.