Unthinkable: Who Survives When Disaster Strikes - and Why

Author: Amanda Ripley

Today Mileti lives in the California desert. He is retired from his longtime teaching post at the University of Colorado at Boulder, but he’s still complaining to anyone who will listen. “In this great, highly educated, affluent country, we do not have adequate warning systems,” Mileti says. “We should have more than luck. We can have more than luck. We’ve been studying warnings for half a century, and we have it nailed.”

Like a lot of disaster researchers, Mileti is perpetually disappointed. Luckily, he also has a sense of humor. After he says something particularly provocative, he laughs with a loud bark, showing off unnaturally white, straight teeth. When he is asked to give speeches, which is often, he sometimes shows up in a Hawaiian shirt. Then he unleashes sweeping condemnations and calls to action. For all these reasons, in the small and sometimes tedious world of disaster research, Mileti has something of a cult following.

In July of 2006, at the annual disaster summit held at the University of Colorado at Boulder, Mileti appeared at a panel titled “Risk-Wise Behavior.” The auditorium was packed with 440 disaster experts. Mileti, who spoke last, was the only one without a PowerPoint presentation. He just got up and started ranting. “How many people do you need to see pounding through their roofs before we tell them how high the floodwaters can be, how hard the ground can shake? How many citizens must die to get us to do it?” he nearly shouted. “If you can’t create the political will, do it anyway.” The crowd went crazy.

As a smoker, Mileti likes to point out that the nation does take some risks seriously: "Do you know how many no-smoking signs you see in an airport? We've just not chosen to do the same thing for natural disasters," he said. "Why can't we put up signs that say, 'This is a tsunami inundation zone' [along the coast] of California? If we're not doing it for other hazards, I say take the no-smoking signs out of the airport."

Later, over hamburgers next to the Boulder Creek, Mileti rattled off other counterexamples: “You know how everyone knows not to take an elevator in a fire? How did that happen? In Hawaii, it’s now part of the culture to get to high ground if you feel an earthquake. It should be the same in Santa Monica. You need to acculturate a tsunami warning system.” Like most people at the workshop, Mileti was heartbroken by Hurricane Katrina—a catastrophe that did not have to happen. Unlike some of the younger attendees, Mileti fully expects to be heartbroken again. “We know exactly—exactly—where the major disasters will occur,” he says, smiling. “But individuals underperceive risk. The public totally discounts low-probability, high-consequence events. The individual says, it’s not going to be this plane, this bus, this time.”

We still measure risk with the ancient slide rule that worked for most of our evolutionary history, even though we have calculators at our side. Likewise, we still eat chocolate cake even though we no longer need to hoard calories. But we can learn to eat less cake, and it is possible to become better judges of risk.

So how do we override our worst instincts? First and most important, the people in charge of warning us should treat us with respect. It's surprising how rarely warnings explain why you should do something, not just what you should do. Once you start noticing this problem, you'll see it everywhere. In fact, I think that the mistakes the public makes in calculating risk are primarily due to this pervasive lack of trust on the part of the people charged with protecting us. They are our escorts through Extremistan, but they don't level with us often enough.

For example, you have heard flight attendants explain how to put on an oxygen mask, should it drop down from the ceiling of the plane. "Secure your own mask before helping others," the warning goes. But the flight attendant does not tell you why. Imagine if you were told that, in the event of a rapid decompression, you would only have ten to fifteen seconds before you lost consciousness. Aha. Then you might understand why you should put your mask on before you help your child. You might understand that if you don't put your mask on first, you'll both be unconscious before you can say, "How does this thing work?" Suddenly the warning would not just sound like nagging legalese; it would sound like common sense. It would motivate.

In the late 1990s, the U.S. government conducted a large and priceless survey of 457 passengers involved in serious airplane evacuations. Over half of them said that they had not watched the entire preflight safety briefing because they had seen it before. Of those who did watch most of the briefing, 50 percent said it had not been helpful to them when the emergency came to pass. In retrospect, they wished they had been told more about exit routes, how to use the slides, and how to get off the wing after fleeing through the overwing exit. They wanted a more vivid, practical warning than they got.

Carry-on bags are a major problem in plane crashes. About half of all passengers try to take their carry-on with them in an evacuation, even though they have been ordered by flight attendants to leave everything behind. (This is the same gathering behavior exhibited by Elia Zedeño in the World Trade Center, when she felt compelled to take things, including a mystery novel, before she left her office.) Later, plane-crash survivors report that these collected carry-on bags posed a major obstacle to getting out quickly and safely. People tripped on them as they groped through the darkness, and the bags became weapons as they hurtled down the evacuation slides. The solution to this problem may not be that complicated, however. In a recent study in the United Kingdom, one volunteer suggested that flight attendants, instead of asking passengers to "leave all hand baggage behind," tell passengers why they should do so. They should simply say this, the volunteer suggested: "Taking luggage will cost lives."

Why don’t the airlines give people better warnings, even when plane-crash survivors tell them how to do it? For one thing, they are in business. They don’t want to scare customers by talking too vividly about crashes. Better to keep the language abstract and forgettable. But there’s another, more insidious reason. Airline employees, like professionals in most fields, don’t particularly trust regular people. “Like police, they think of civilians as a grade below them,” says Dan Johnson, a research psychologist who has worked for the airlines in various capacities for more than three decades. At aviation conferences, he still has trouble getting experts to appreciate the human factor. “They would rather talk about hardware and training manuals—and not worry about what I consider equally important, which is the behavior of the actual people.” If the worst does happen, this distrust makes things harder still for regular people. “Often the pilots and the flight attendants do not want to inform the passengers about an emergency for fear of upsetting them,” Johnson says. “So they let them sit there in ignorance, and when the accident does happen, no one knows what the hell is going on.”

On the D.C. subway system recently, I heard this taped announcement: “In the event of a fire, remain calm and listen for instructions.” That’s it. Hundreds of conversations and thoughts were interrupted for that announcement. What was the message? That the officials who run the subway system do not trust me. They think I will dissolve into hysterics and ignore instructions in the event of a fire.

Consider what the people who created this announcement did not do: they had an excellent opportunity to tell me how many subway fires happen in the D.C. system each year. That would have gotten my attention. They also had a chance to explain why it's almost always better to stay in the subway car in case of a fire (because the rails on the track can electrocute you, and the tunnels are, in some places, too narrow to fit through if a train is coming). But instead, they just told me not to panic. Ah, thank you so much. And here I'd been planning on panicking!

Trust is the basic building block of any effective warning system. Right now, it's too scarce in both directions: officials don't trust the public, and the public doesn't trust officials either. That's partly an unintended consequence of the way we live. "Our social and democratic institutions, admirable as they are in many respects, breed distrust," Slovic wrote in his 2000 book, The Perception of Risk. A capitalist society with a free press has many things to recommend it. But it is not a place where citizens have overwhelming confidence in authority figures. Distrust makes it harder for the government to compensate for its citizens' blind spots—one of government's most vital functions.

Overcoming the trust deficit requires some ingenuity. But it can be done. The easiest way to mesmerize the brain is through images. Anecdotes, as any journalist or advertiser knows, always trump statistics. That’s why lottery advertisements feature individual winners basking in the glow of newfound wealth. “Ramon Valencia celebrates Father’s Day by winning a cool $1 million!” reads a California Lottery announcement. Probabilities pale in comparison to Ramon Valencia, father of four, from La Puente.

Usually, people think in binary terms: either something will happen or it won’t. Either it will affect me, or it won’t. So when people hear they have a 6 in 100,000 chance of dying from a fall, they shelve that risk under the label “won’t happen to me,” even though falling is in fact the third most common cause of accidental deaths in the United States (after car crashes and poisoning). It would be much more powerful to tell people about Grant Sheal, age three, who fell and cut himself on a vase while playing at home in February 2007. The toddler died from his injuries. Or about Patryk Rzezawski, nineteen, who fell down and hit his head that same month while walking near his home. He was pronounced dead at the scene. These deaths are almost always described in news accounts as “freak accidents,” despite the fact that they are relatively common.

When people imagine good things happening to them, they become more prone to take risks—regardless of the odds. In human brain imaging studies, part of the brain called the “ventral striatum” is highly active in gamblers and drug addicts. Within this region, something called the “nucleus accumbens” lights up when people just anticipate winning money. When this region is activated, people have a tendency to take more risks. So all a casino has to do is get you to anticipate winning—even if you never actually experience it. This might explain why casinos ply gamblers with minirewards like cheap food, free drinks, bonus points, and surprise gifts. Anticipating those rewards can activate the nucleus accumbens, which in turn can lead to more risk taking.

Another part of the brain lights up when people imagine losing. The “anterior insula” is active when people calculate the risk of bad things happening—like disasters. This region also shows activation when people are anticipating upsetting images. So it makes sense that insurance advertisements might encourage risk-averse behavior (i.e., buying policies) by activating the anterior insula through scary images.

This isn’t to say people need to be terrified into planning for disasters. Subtlety can work too. In Old Town Alexandria, Virginia, lines etched into a large renovated factory mark how high the Potomac River has risen in previous floods. At the Starbucks next door, one of the photos on the walls shows floodwaters surrounding the café and a man in a yellow rain slicker canoeing past. There are creative ways to institutionalize memory in everyday life.

In fact, it's important not to overwhelm people with a warning that's too frightening. Eric Holdeman ran King County's Office of Emergency Management in Washington State for eleven years. He has found that there's a fine line between getting people's attention and losing them to a sense of futility. In 2005, an organization in his state issued a big report about what would happen if a massive earthquake occurred on the Seattle fault. The fault could deliver a magnitude 7.4 earthquake. But the report's authors deliberately proposed a less-frightening hypothetical (a magnitude 6.7 quake, which would kill an estimated 1,660 people), betting they would get more attention. Says Holdeman: "Sometimes it's hard to get people to do worst-case planning because the worst case is so bad. People just throw up their hands."

But given reasonable, tangible advice, people can be very receptive. In the nation of Vanuatu, east of Australia, the residents of a remote part of Pentecost Island have no access to modern amenities. But once a week, they get to watch TV. A truck with a satellite dish, a VCR, and a TV comes to town and everyone gathers round for some entertainment. After a 1998 earthquake in Papua New Guinea, the TV truck showed a UNESCO video on how to survive a tsunami. In 1999, the islanders felt the earth shake, just like in the video, and they ran for high ground. Thirty minutes later, a giant wave inundated the town. But only three people out of five hundred died.

But all over the world, even in developing nations, officials have an unfortunate preference for high-tech gadgetry over simplicity. In coastal Bangladesh, after a 1970 cyclone killed more than three hundred thousand people, the government devised a complex warning system. Volunteers were trained to hoist flags representing one of ten different warning levels. But a 2003 survey of rural villagers found that many took no notice of the semaphore system. "I know there are disaster signals ranging from Signal No. 1 to 10," Mohammud Nurul Islam told a team from the Benfield Hazard Research Centre, based at University College London. "But I have no idea what they mean." He does have his own personal survival system, however. "I can predict any disaster coming when the sky turns gloomy, bees move around in clusters, the cattle become restless, and the wind blows from the south."

Even a child can do better than a fancy warning system, if she has been trusted with some basic information. English schoolgirl Tilly Smith was vacationing with her parents and sister in Thailand in 2004 when the tide suddenly rushed out. Tourists pointed at the fish flopping on the sand. Out on the horizon, the water began to bubble strangely, and boats bobbed up and down. Smith, ten, had just learned about tsunamis in her geography class, two weeks earlier. She had watched a video of a Hawaii tsunami and learned all the signs. "Mummy, we must get off the beach now. I think there is going to be a tsunami," she said. Her parents started warning people to leave. Then the family raced up to the JW Marriott hotel where they were staying and alerted the staff, who evacuated the rest of the beach. In the end, the beach was one of the few in Phuket where no one was killed or seriously hurt.
