
Paradoxically, properly inferring causation depends on an element of randomness. Each person must be assigned randomly to one of the two groups—otherwise, any differences between the groups could be due to other systematic biases. Let’s say you just asked people to report whether they listen to music while working and you found that people who worked in silence tended to be more productive. Many factors could cause this difference. Perhaps people who are better educated prefer working in silence, or perhaps people with attention deficits are more likely to listen to music.
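
To make the confounding problem concrete, here is a minimal simulation sketch (ours, not from any study described in this book; the numbers and the “education” confounder are purely illustrative). It generates data in which a hidden factor drives both the preference for working in silence and productivity, so simply comparing the self-selected groups shows a gap even though music has no true effect, while random assignment to silence or music shows essentially none.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hidden confounder (illustrative): call it "education."
education = rng.normal(0.0, 1.0, n)

# Observational world: better-educated people are more likely to choose silence,
# and education (not the absence of music) drives productivity.
chose_silence = rng.random(n) < 1.0 / (1.0 + np.exp(-2.0 * education))
productivity = 50.0 + 5.0 * education + rng.normal(0.0, 5.0, n)
observational_gap = productivity[chose_silence].mean() - productivity[~chose_silence].mean()

# Experimental world: a coin flip assigns silence vs. music, so the confounder
# is balanced across groups; music still has zero true effect on productivity.
assigned_silence = rng.random(n) < 0.5
randomized_gap = productivity[assigned_silence].mean() - productivity[~assigned_silence].mean()

print(f"Observational gap (silence minus music): {observational_gap:+.2f}")
print(f"Randomized gap    (silence minus music): {randomized_gap:+.2f}")
```

Running it prints a clearly positive observational gap and a randomized gap near zero, which is exactly the difference between a correlation and an experiment.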

A standard principle taught in introductory psychology classes is that correlation does not imply causation. This principle needs to be taught because it runs counter to the illusion of cause. It is particularly hard to internalize, and in the abstract, knowing the principle does little to immunize us against the error. Fortunately, we have a simple trick to help you spot the illusion in action: When you hear or read about an association between two factors, think about whether people could have been assigned randomly to conditions for one of them. If it would have been impossible, too expensive, or ethically dubious to randomly assign people to those groups, then the study could not have been an experiment and the causal inference is not supported. To illustrate this idea, here are some examples taken from actual news headlines:[10]

  • “Drop That BlackBerry! Multitasking May Be Harmful”—Could researchers randomly assign some people to lead a multitasking, BlackBerry-addicted life and others to just focus on one thing at a time all day long? Probably not. The study actually used a questionnaire to find people who already tended to watch TV, text-message, and use their computers simultaneously, and compared them with people who tended to do just one of these things at a time. Then they gave a set of cognitive tests to both groups and found that the multitaskers did worse on some of the tests. The original article describes the study’s method clearly, but the headline added an unwarranted causal interpretation. It’s also possible that people who do badly at the cognitive tests also think they can multitask just fine, and therefore tend to do it more than they should.

  • “Bullying Harms Kids’ Mental Health”—Could a researcher randomly assign some kids to be bullied and others not to be bullied? No—not ethically, anyway. So the study must have measured an association between being bullied and suffering mental health problems. The causal relationship could well be reversed—children who have mental health issues might be more likely to get bullied. Or some other factors, perhaps in their family background, could cause them both to be bullied and to have mental health issues.

  • “Does Your Neighborhood Cause Schizophrenia?”—This study showed that rates of schizophrenia were greater in some neighborhoods than others. Could the researchers have randomly assigned people to live in different neighborhoods? In our experience people generally like to participate in psychology experiments, but requiring them to pack up and move might be asking too much.

  • “Housework Cuts Breast Cancer Risk”—We doubt experimenters would have much luck randomly assigning some women to a “more housework” condition and others to a “less housework” condition (though some of the subjects might be happy with their luck).

  • “Sexual Lyrics Prompt Teens to Have Sex”—Were some teens randomly assigned to listen to sexually explicit lyrics and others to listen to more innocuous lyrics, and then observed to see how much sex they had? Perhaps an adventurous experimenter could do this in the lab, but that’s not what these researchers did. And it’s doubtful that exposing teens to the music of Eminem and Prince in a lab would cause a measurable change in their sexual behavior even if such an experiment were conducted.

Once you apply this trick, you can see the humor in most of these misleading headlines. In most of these cases, the researchers likely knew the limits of their studies, understood that correlation does not imply causation, and used the right logic and terminology in their scientific papers. But when their research was “translated” for popular consumption, the illusion of cause took over and these subtleties were lost. News reporting often gets the causation wrong in an attempt to make the claim more interesting or the narrative more convincing. It’s far less exciting to say that those teens who listen to sexually explicit lyrics also happen to have sex at earlier ages. That more precise phrasing leaves open the plausible alternatives—that having sex or being interested in sex makes teens more receptive to sexually explicit lyrics, or that some other factor contributes to both sexual precocity and a preference for sexually explicit lyrics.

And Then What Happened?

The illusory perception of causes from correlations is closely tied to the appeal of stories. When we hear that teens are listening to sexually explicit music or playing violent games, we expect there to be consequences, and when we hear that those same teens are subsequently more likely to have sex or to be violent, we perceive a causal link. We immediately believe we understand how these behaviors are causally linked, but our understanding is based on a logical fallacy. The third major mechanism driving the illusion of cause comes from the way in which we interpret narratives. In chronologies or mere sequences of happenings, we assume that the earlier events must have caused the later ones.

David Foster Wallace, the celebrated author of the novel Infinite Jest, committed suicide by hanging himself in the late summer of 2008. Like many famous creative writers, he suffered for a long time from depression and substance abuse, and he had attempted suicide before. Wallace was something of a literary prodigy, publishing his first novel, The Broom of the System, at the age of twenty-five while he was still studying for his master of fine arts (MFA) degree. The book was praised by the New York Times, but received mixed reviews elsewhere. Wallace worked on a follow-up short story collection, but could not help feeling like a failure. His mother brought him back to live at home. According to a profile in the New Yorker by D. T. Max,[11] things went downhill quickly:

One night, he and Amy [his sister] watched “The Karen Carpenter Story,” a maudlin TV movie about the singer, who died of a heart attack brought on by anorexia. When it was over, Wallace’s sister, who was working on her own M.F.A., at the University of Virginia, told David that she had to drive back to Virginia. David asked her not to go. After she went, he tried to commit suicide with pills.

What do you make of this passage about Wallace’s earlier suicide attempt? To us, the most natural interpretation is that the movie upset Wallace, that he wanted his sister to stay with him but she refused, and that in despair over losing her companionship, he overdosed. But if you read the passage again, you will see that none of these facts are stated explicitly. Strictly speaking, even the idea that he wanted her to stay is only implied by the sentence, “David asked her not to go.” Max is almost clinically sparing in his just-the-facts approach. But the interpretation we attach to these facts seems obvious; we come to it automatically and without conscious thought, indeed without even realizing that we are adding in information that is not present in the source. This is the illusion of cause at work. When a series of facts is narrated, we fill in the gaps to create a causal sequence: Event 1 caused Event 2, which caused Event 3, and so on. The movie made Wallace sad, which made him ask Amy to stay; she went, so she must have refused him, causing him to attempt suicide.

In addition to automatically inferring cause when it is only implied by a sequence, we also tend to remember a narrative better when we have to draw such inferences than when we don’t. Consider the following pairs of sentences, taken from a study by University of Denver psychologist Janice Keenan and her colleagues:[12]

  1. Joey’s big brother punched him again and again. The next day his body was covered by bruises.

  2. Joey’s crazy mother became furiously angry with him. The next day his body was covered by bruises.

In the first case, no inference is needed—the cause of Joey’s bruising is stated explicitly in the first sentence. In the second case, the cause of the bruises is implied but not stated. For this reason, understanding the second pair of sentences turns out to be slightly harder (and takes slightly longer) than understanding the first. But what you’re doing as you read the sentences is crucial. To understand the second pair of sentences, you must make an extra logical inference that you don’t need in order to make sense of the first pair. And in making this inference, you form a richer and more elaborate memory for what you’ve read. Readers of the New Yorker story likely will remember the implied cause of Wallace’s early suicide attempt, even though it never was stated in the story itself. They will do so because they drew the inference themselves rather than having it handed to them.

“Tell me a story,” children beg their parents. “And then what happened?” they ask if they hear a pause. Adults spend billions of dollars on movies, television, novels, short stories, works of biography and history, and other forms of narrative. One appeal of spectator sports is their chronology; every play, every shot, every home run is a new event in a story whose ending is in doubt. Teachers—and authors of books on science—are learning that stories are effective ways to grab and control an audience’s attention.[13] But there is a paradox here: Stories—that is, sequences of events—are by themselves entertaining, but not directly useful. It’s hard to see why evolution would have designed our brains to prefer receiving facts in chronological order unless there was some other benefit to be gained from that type of presentation. Unlike a specific story, a general rule about what causes what can be extremely valuable. Knowing that your brother ate a piece of fruit with dark spots on it and then vomited encourages you to infer causation (by food poisoning), a piece of knowledge that can help you in a wide variety of future situations. So we may delight in narrative precisely because we compulsively assume causation when all we have is chronological order, and it’s the causation, not the sequence of events, that our brains are really designed to crave and use.

In the next paragraph of his David Foster Wallace profile, D. T. Max tells us that after recovering from his suicide attempt, “Wallace had decided that writing was not worth the risk to his mental health. He applied and was accepted as a graduate student in philosophy at Harvard.” Again, the causation is implied: It was Wallace’s fear of depression and suicide that drove him—perhaps ironically—to graduate study in philosophy. But what are we to conclude about how he went about it? One possibility is that he applied to Harvard, and only to Harvard. A much more common practice is to apply to a wide variety of graduate programs and to see which ones admit you. Applying just to Harvard is the act of someone who is either expressing supreme confidence or setting himself up to fail (or both); applying broadly is the act of someone who just wants to pursue his interests at the best school he can get into. The different actions signal different personalities and approaches to life.

It seems to us that Max is implying that Wallace applied only to Harvard, because if he had applied to other schools, that fact would have been relevant for our interpretation of Wallace’s behavior, so the author would have mentioned it. We automatically make the assumption, when reading statements like this one, that we have been given all of the information we need, and that the most straightforward causal interpretation is also the correct one. Max’s words don’t explicitly say that Wallace applied only to Harvard; they just lead us, without our awareness, into concluding that he did.

The mind apparently prefers to make these extra leaps of logic over being explicitly told the reasons for everything. This may be one reason why the timeworn advice “show, don’t tell” is so valuable to creative writers seeking to make their prose more compelling. The illusion of narrative can indeed be a powerful tool for authors and speakers. By arranging purely factual statements in different orders, or by omitting or inserting relevant information, they can control what inferences their audiences will make, without explicitly arguing for and defending those inferences themselves. D. T. Max, whether deliberately or not, creates the impression that Wallace’s suicide attempt was precipitated by his sister’s possibly callous refusal to stay with him, and that Wallace chose to apply only to Harvard for graduate school. When you know about the contribution of narrative to the illusion of cause, you can read his words differently, and see that none of these conclusions are necessarily correct. (Tip: Listen carefully for when politicians and advertisers use this technique!)

“I Want to Buy Your Rock”

A conversation between Homer and Lisa in an episode of The Simpsons provides one of the best illustrations of the dangers of turning a temporal association into a causal explanation.[14] After a bear is spotted in Springfield, the town initiates an official Bear Patrol, complete with helicopters and trucks with sirens, to make sure no bears are in town.

HOMER: Ahhh … not a bear in sight. The bear patrol must be working like a charm.

LISA: That’s specious reasoning, Dad.

HOMER: Thank you, honey.

LISA (picking up a rock from the ground): By your logic, I could claim that this rock keeps tigers away.

HOMER: Ooooh … how does it work?

LISA: It doesn’t work—it’s just a stupid rock. But I don’t see any tigers around here, do you?

HOMER: Lisa, I want to buy your rock.
