The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us
Authors: Christopher Chabris, Daniel Simons
Perhaps the most striking aspect of the illusion is how rarely we bother to do anything to probe the limits of our knowledge—especially considering how easy it is to do this. Before telling Leon Rozenblit that you know why the sky is blue, all you have to do is simulate the “why boy” game with yourself to see whether you actually know. We fall prey to the illusion because we simply do not recognize the need to question our own knowledge. According to Rozenblit,
In our day-to-day lives, do we stop and ask ourselves, “Do I know where the rain is coming from?” We probably don’t do it without provocation, and it only happens in appropriate social and cognitive contexts: a five-year-old asks you, you’re having an argument with someone, you’re trying to write about it, you’re trying to teach a class about it.
And even when we do check our knowledge, we often mislead ourselves. We focus on those snippets of information that we do possess, or can easily obtain, but ignore all of the elements that are missing, leaving us with the impression that we understand everything we need to. The illusion is remarkably persistent. Even after completing an entire experiment with Rozenblit, repeatedly playing the “why boy” game, some subjects still did not spontaneously check their own knowledge before proclaiming that they would have done better with different objects: “If you had just asked me about the lock, I could have done that.”
Our tendency to make this error isn’t limited to our thoughts and beliefs about physical devices and systems. It happens whenever we have a big project to complete, a problem to solve, or an assignment to do. We must overcome the temptation to dive in and get started rather than examine our understanding of the task and its requirements. Avoiding this aspect of the illusion of knowledge was the key for Tim Roberts, who won the $25,000 top prize in the 2008 edition of a computer programming tournament called the TopCoder Open. He had six hours to write a program that met a set of written specifications. Unlike his competitors, Roberts spent the first hour studying the specs and asking questions—“at least 30”—of their author. Only after verifying that he completely understood the challenge did he start to code. He completed a program that did exactly what was required, and nothing more. But it worked, and it was finished on time. The time he spent escaping the illusion of knowledge was an investment that paid off handsomely in the end.[12]
The illusion of knowledge makes us think we know how common objects work when we really don’t, but it is even more influential and consequential when we reason about complex systems. Unlike a toilet or a bicycle, a complex system has many more interacting parts, and the system’s overall behavior cannot be easily determined just by knowing how its individual parts behave. Large-scale innovative engineering projects, like the construction of the iconic Sydney Opera House or the “Big Dig” in Boston, are classic examples of this sort of complexity.
The Big Dig was a project intended to reorganize the transportation network in downtown Boston.[13]
In 1948, the Massachusetts government developed a plan to build new highways through and around the city in an attempt to address growing traffic volume on local roads. As part of this highway expansion, a thousand buildings were destroyed and twenty thousand residents were displaced to erect a two-level elevated highway cutting through downtown Boston. Although it was six lanes wide, the highway had too many on- and off-ramps, and it was subject to chronic stop-and-go congestion for eight or more hours every day. It was also an eyesore. Disappointment with these results caused a companion project to be cancelled, further increasing the load on the elevated highway.
The main goals of the Big Dig, which entered the planning stage in 1982, were to move the downtown portion of the elevated highway underground and to build a new tunnel under Boston Harbor to connect the city to Logan International Airport. Several other roads and bridges were added or improved. In 1985, the entire operation was projected to cost $6 billion. Construction began in 1991, and by the time it was completed in 2006, the total cost was nearly $15 billion. Since much of the money was borrowed by issuing bonds, the ultimate cost by the time all loans are repaid will include an additional $7 billion in interest, resulting in a total expense more than 250 percent higher than originally planned.
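The arithmetic behind that overrun figure is easy to check. Here is a minimal sketch in Python, using only the dollar amounts quoted above (the rounding to whole billions is ours):

```python
# Back-of-the-envelope check of the Big Dig figures quoted above,
# all in billions of dollars and rounded.
projected = 6.0       # 1985 projection
construction = 15.0   # approximate cost at completion in 2006
interest = 7.0        # additional interest on the bonds

total = construction + interest
overrun_pct = (total - projected) / projected * 100
print(f"Total expense: ${total:.0f} billion")
print(f"Overrun vs. original plan: {overrun_pct:.0f}%")  # about 267%, i.e. "more than 250 percent higher"
```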
The Big Dig’s cost grew for many reasons. One was the constant need to change plans as the project progressed. Officials considered stacking elevated highways one hundred feet high at one location in order to get traffic to where it needed to be; in the end that problem was solved by constructing a bridge that was the largest of its type ever built. Another factor driving up costs was the need to develop new technologies and engineering methods to meet the challenges of submerging miles of highway in an area already dense with subway lines, railroad tracks, and buildings. But why weren’t these engineering complications foreseen? Everybody involved knew that the Big Dig was a public works effort of unprecedented size and complexity, but nobody realized, at least early on, that their estimates of the time and cost to complete it were little more than shots in the dark, and optimistic shots at that.
It is not as though this sort of underestimate had never happened before. The history of architecture is replete with examples of projects that turned out to be more difficult and costly than their designers—and the businessmen and politicians who launched them—ever expected. The Brooklyn Bridge, built between 1870 and 1883, cost twice as much as originally planned. The Sydney Opera House was commissioned by the Australian government in 1959 and designed by Danish architect Jørn Utzon over six months in his spare time. It was forecast in 1960 to cost 7 million Australian dollars. By the time it was finished, the bill came to AU$102 million. (Another AU$45 million will need to be spent to bring the building in line with aspects of Utzon’s original design that were not realized.) Antoni Gaudí began to direct the construction of the Sagrada Família Church in Barcelona in 1883, and he said in 1886 that he could finish it in ten years. It is expected to be completed in 2026, a mere one hundred years after his death.[14]
It is said that “the best-laid plans of mice and men often go awry” and that “no battle plan survives contact with the enemy.” Hofstadter’s Law tells us: “It always takes longer than you expect, even when you take into account Hofstadter’s Law.”[15]
The fact that we need these aphorisms to remind us of the inherent difficulty of planning demonstrates the strength of the illusion of knowledge. The problem is not that our plans go awry—after all, the world is more complex than our simple mental models and, as Yogi Berra explained, “it’s tough to make predictions, especially about the future.”[16]
Even expert project managers don’t get it right: They are more accurate than amateurs, but they are still wrong one-third of the time.[17]
We all experience this sort of illusory knowledge, even for simpler projects. We underestimate how long they will take or how much they will cost, because what seems simple and straightforward in our mind typically turns out to be more complex when our plans encounter reality. The problem is that we never learn to take this limitation into account. Over and over, the illusion of knowledge convinces us that we have a deep understanding of what a project will entail, when all we really have is a rough and optimistic guess based on shallow familiarity.
By now you may be sensing a pattern to the everyday illusions we have been discussing: They all tend to cast an overly favorable light on our mental capacities. There are no illusions of blindness, amnesia, idiocy, or cluelessness. Instead, everyday illusions tell us that we perceive and remember more than we do, that we’re all above average, and that we know more about the world and the future than is justified. Everyday illusions might be so persistent and pervasive in our thought patterns precisely because they lead us to think better of ourselves than we objectively should. Positive illusions can motivate us to get out of bed and optimistically take up challenges we might shrink from if we constantly had the truth about our minds in mind. If these illusions are in fact driven by a bias toward overly positive self-evaluation, then people who are less subject to this bias should also be less subject to everyday illusions. Indeed, people suffering from depression do tend to evaluate themselves more negatively and less optimistically, possibly resulting in a more accurate view of the relationship between themselves and the world.[18]
A larger dose of realism in planning ought to help us make better decisions about how to allocate our time and resources. Since the illusion of knowledge is an inherent barrier to realism in any plans we draw up for our own use, how can we avoid it? The answer is simple to learn, but not so simple to execute, and it works only for the kinds of projects that have been done many times before—it works if you are writing a report, developing a piece of software, renovating your house, or even putting up a new office building, but not if you are planning a one-of-a-kind project like the Big Dig. Fortunately, most of the projects you do are not as unique as you may think they are. For us, planning this book was a unique and unprecedented task. But for a publisher trying to estimate how long it would take us to write it, it was similar to all the other nonfiction, two-author, three-hundred-page books that have come out in the last few years.
To avoid the illusion of knowledge, start by admitting that your personal views of how expensive and time-consuming your own seemingly unique project will be are probably wrong. It can be hard to do this, because you truly do know much more about your own project than anyone else does, but this familiarity gives the false sense that only you understand it well enough to plan it out accurately. If instead you seek out similar projects that other people or organizations have already completed (the more similar to yours, the better, of course), you can use the actual time and cost of those projects to gauge how long yours will take. Taking such an “outside view” of what we normally keep inside our own minds dramatically changes the way we see our plans.[19]
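One way to make this “outside view” concrete is to scale your own estimate by how badly comparable projects overran theirs. The sketch below is ours, not the authors’; the project numbers are invented, and a simple average of past overrun ratios stands in for whatever richer reference-class data you can find:

```python
# A minimal sketch of an "outside view" forecast: correct your own
# (probably optimistic) estimate using the estimated-vs-actual record
# of similar, already-completed projects. All numbers are hypothetical.
past_projects = [
    # (estimated months, actual months) for comparable projects
    (6, 9),
    (4, 7),
    (12, 20),
    (8, 11),
]

# On average, how much longer did similar projects take than planned?
ratios = [actual / estimated for estimated, actual in past_projects]
correction = sum(ratios) / len(ratios)

inside_estimate = 5  # months: your own gut forecast, the "inside view"
outside_view = inside_estimate * correction
print(f"Correction factor: {correction:.2f}x")
print(f"Outside-view estimate: {outside_view:.1f} months")  # ~7.9 months
```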
Even if you don’t have access to a database of renovation project timelines or software engineering case studies, you can ask other people to take a fresh look at your ideas and make their own forecast for the project. Not a forecast of how long it would take them to execute the ideas (since they too will likely underestimate their own time and costs), but of how long it will take you (or your contractors, employees, etc.) to do so. You can also imagine rolling your eyes as someone else excitedly tells you about their own plans to get a project like yours done. Such mental simulations can help you adopt an outside view. As a last resort, just calling to mind occasions when you were wildly optimistic (if you can be objective enough to recall them—we’ve all been foolish in this way more than once in our lives) can help you to reduce the illusion of knowledge that distorts your current predictions.[20]
Thirty-two-year-old Brian Hunter was paid at least $75 million in 2005. His job was to trade futures contracts in energy, especially natural gas, for a Greenwich, Connecticut, hedge fund called Amaranth Advisors. His trading strategy involved placing bets on the future price of gas by buying and selling options. In the summer of 2005, when gas was trading at $7–9 per million BTUs, he predicted that prices would rise considerably by early fall, so he loaded up on cheap options to buy at prices like $12 that seemed outrageously high to the market at the time. When hurricanes Katrina, Rita, and Wilma devastated oil platforms and processing plants along the coast of the Gulf of Mexico in late summer, prices went over $13. Suddenly, Hunter’s previously overpriced options were valuable. With trades like this he generated profits of more than $1 billion that year for Amaranth and its investors.
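The mechanics of that bet come down to the payoff of a call option: the right to buy at a fixed “strike” price, which is worth nothing at expiry unless the market price climbs above the strike. Here is a minimal sketch using the prices mentioned above (the option premium itself is not given in the text, so it is omitted here):

```python
# Intrinsic value of a call option: the right (not the obligation)
# to buy at a fixed strike price.
def call_payoff(spot: float, strike: float) -> float:
    """Per-unit value of exercising the call; zero unless spot > strike."""
    return max(0.0, spot - strike)

strike = 12.0           # $12 per million BTUs: "outrageously high" in mid-2005
summer_spot = 8.0       # gas traded at $7-9 that summer
post_storm_spot = 13.0  # prices after Katrina, Rita, and Wilma

print(call_payoff(summer_spot, strike))      # 0.0 -> looks worthless, so the option is cheap
print(call_payoff(post_storm_spot, strike))  # 1.0 -> suddenly in the money
```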
By August of the next year, Hunter and his colleagues had racked up gains of $2 billion. Gas prices had peaked at over $15 the previous December, post-Katrina, but were now in decline. Hunter again placed a huge bet that they would reverse course and rise again. Instead, prices plunged, falling below $5. In a single September week, Hunter’s trades lost $5 billion, approximately one-half of Amaranth’s total assets. After a total loss of approximately $6.5 billion, which at the time was the largest publicly disclosed trading loss in history, the fund was forced to liquidate.
What went wrong at Amaranth? Brian Hunter, and others at the firm, believed that they knew more about their world (the energy markets) than they actually did. Amaranth’s founder, Nick Maounis, thought that Hunter was “really, really good at taking controlled and measured risk.” But Hunter’s success was due at least as much to unpredictable events like hurricanes as to his understanding of the markets. Just before the blowup, Hunter himself even said, “Every time you think you know what these markets can do, something else happens.” But risk was apparently not being managed, and Hunter had not fully accounted for the unpredictability of the energy markets. He had actually made the same mistake earlier in his career at Deutsche Bank, blaming a one-week December 2003 loss of $51 million on “an unprecedented and unforeseeable run-up in gas prices.”[21]
Throughout the history of financial markets, investors have formed theories to explain why some assets go up and others go down in value, and some writers have promoted simple strategies derived from these models. The Dow theory, based on the late-nineteenth-century writings of Wall Street Journal founder Charles Dow, was premised on the idea that investors could tell whether an upswing in industrial stocks was likely to continue by looking for a similar upswing in transportation company shares. The “Nifty Fifty” theory of the 1960s and early 1970s claimed that the best growth would be achieved by fifty of the largest multinational corporations traded on the New York Stock Exchange, and those were therefore the best and—by virtue of their size—safest investments. The 1990s saw the “Dogs of the Dow” and the “Foolish Four”—models that advocated holding particular proportions of the stocks from the Dow Jones Industrial Average that paid the highest dividends as a percentage of their share prices.[22]
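Of these, the “Dogs of the Dow” rule is mechanical enough to state in a few lines of code. This sketch is illustrative only: the tickers, prices, and dividends are invented, and the real strategy ranks the thirty Dow components and rebalances annually:

```python
# A minimal sketch of the "Dogs of the Dow" selection rule: rank stocks
# by dividend yield (annual dividend / share price) and hold the top ones.
# All tickers, prices, and dividends below are invented.
stocks = {
    # ticker: (share price, annual dividend per share)
    "AAA": (120.0, 4.80),
    "BBB": (45.0, 2.70),
    "CCC": (210.0, 3.10),
    "DDD": (60.0, 3.00),
}

def dividend_yield(price: float, dividend: float) -> float:
    return dividend / price

# Sort from highest to lowest yield; with the full Dow you would keep ten.
dogs = sorted(stocks.items(), key=lambda kv: dividend_yield(*kv[1]), reverse=True)

for ticker, (price, dividend) in dogs:
    print(f"{ticker}: yield {dividend_yield(price, dividend):.1%}")
```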