The Better Angels of Our Nature: Why Violence Has Declined

Author: Steven Pinker

Tags: #Sociology, #Psychology, #Science, #21st Century, #Crime, #Anthropology, #Social History, #Criminology


In the wine-tasting study, Macy et al. first whipped their participants into a self-conscious lather by telling them they were part of a group that had been selected for its sophistication in appreciating fine art. The group would now take part in the “centuries-old tradition” (in fact, concocted by the experimenters) called a Dutch Round. A circle of wine enthusiasts first evaluate a set of wines, and then evaluate one another’s wine-judging abilities. Each participant was given three cups of wine and asked to grade them on bouquet, flavor, aftertaste, robustness, and overall quality. In fact, the three cups had been poured from the same bottle, and one was spiked with vinegar. As in the Asch experiment, the participants, before being asked for their own judgments, witnessed the judgments of four stooges, who rated the vinegary sample higher than one of the unadulterated samples, and rated the other one best of all. Not surprisingly, about half the participants defied their own taste buds and went with the consensus.
Then a sixth participant, also a stooge, rated the wines accurately. Now it was time for the participants to evaluate one another, which some did confidentially and others did publicly. The participants who gave their ratings confidentially respected the accuracy of the honest stooge and gave him high marks, even if they themselves had been browbeaten into conforming. But those who had to offer their ratings publicly compounded their hypocrisy by downgrading the honest rater.
The experiment on academic writing was similar, but with an additional measure at the end. The participants, all undergraduates, were told they had been selected as part of an elite group of promising scholars. They had been assembled, they learned, to take part in the venerable tradition called the Bloomsbury Literary Roundtable, in which readers publicly evaluate a text and then evaluate each other’s evaluation skills. They were given a short passage to read by Robert Nelson, Ph.D., a MacArthur “genius grant” recipient and Albert W. Newcombe Professor of Philosophy at Harvard University. (There is no such professor or professorship.) The passage, called “Differential Topology and Homology,” had been excerpted from Alan Sokal’s “Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity.” The essay was in fact the centerpiece of the famous Sokal Hoax, in which the physicist had written a mass of gobbledygook and, confirming his worst suspicions about scholarly standards in the postmodernist humanities, got it published in the prestigious journal Social Text.[284]
The participants, to their credit, were not impressed by the essay when they rated it in private. But when they rated it in public after seeing four stooges give it glowing evaluations, they gave it high evaluations too. And when they then rated their fellow raters, including an honest sixth one who gave the essay the low rating it deserved, they gave him high marks in private but low marks in public. Once again the sociologists had demonstrated that people not only endorse an opinion they do not hold if they mistakenly believe everyone else holds it, but they falsely condemn someone else who fails to endorse the opinion. The extra step in this experiment was that Macy et al. got a new group of participants to rate whether the first batch of participants had sincerely believed that the nonsensical essay was good. The new raters judged that the ones who condemned the honest rater were more sincere in their misguided belief than the ones who chose not to condemn him. It confirms Macy’s suspicion that enforcement of a belief is perceived as a sign of sincerity, which in turn supports the idea that people enforce beliefs they don’t personally hold to make themselves look sincere. And that, in turn, supports their model of pluralistic ignorance, in which a society can be taken over by a belief system that the majority of its members do not hold individually.
 
It’s one thing to say that a sour wine has an excellent bouquet or that academic balderdash is logically coherent. It’s quite another to confiscate the last bit of flour from a starving Ukrainian peasant or to line up Jews at the edge of a pit and shoot them. How could ordinary people, even if they were acquiescing to what they thought was a popular ideology, overcome their own consciences and perpetrate such atrocities?
The answer harks back to the Moralization Gap. Perpetrators always have at their disposal a set of self-exculpatory stratagems that they can use to reframe their actions as provoked, justified, involuntary, or inconsequential. In the examples I mentioned in introducing the Moralization Gap, perpetrators rationalize a harm they committed out of self-interested motives (reneging on a promise, robbing or raping a victim). But people also rationalize harms they have been pressured into committing in the service of someone else’s motives. They can edit their beliefs to make the action seem justifiable to themselves, the better to justify it to others. This process is called cognitive dissonance reduction, and it is a major tactic of self-deception.[285]
Social psychologists like Milgram, Zimbardo, Baumeister, Leon Festinger, Albert Bandura, and Herbert Kelman have documented that people have many ways of reducing the dissonance between the regrettable things they sometimes do and their ideal of themselves as moral agents.[286]
One of them is euphemism—the reframing of a harm in words that somehow make it feel less immoral. In his 1946 essay “Politics and the English Language,” George Orwell famously exposed the way governments could cloak atrocities in bureaucratese:
In our time, political speech and writing are largely the defense of the indefensible. Things like the continuance of British rule in India, the Russian purges and deportations, the dropping of the atom bombs on Japan, can indeed be defended, but only by arguments which are too brutal for most people to face, and which do not square with the professed aims of the political parties. Thus political language has to consist largely of euphemism, question-begging and sheer cloudy vagueness. Defenseless villages are bombarded from the air, the inhabitants driven out into the countryside, the cattle machine-gunned, the huts set on fire with incendiary bullets: this is called pacification. Millions of peasants are robbed of their farms and sent trudging along the roads with no more than they can carry: this is called transfer of population or rectification of frontiers. People are imprisoned for years without trial, or shot in the back of the neck or sent to die of scurvy in Arctic lumber camps: this is called elimination of unreliable elements. Such phraseology is needed if one wants to name things without calling up mental pictures of them.[287]
 
Orwell was wrong about one thing: that political euphemism was a phenomenon of his time. A century and a half before Orwell, Edmund Burke complained about the euphemisms emanating from revolutionary France:
The whole compass of the language is tried to find sinonimies and circumlocutions for massacres and murder. Things are never called by their common names. Massacre is sometimes called agitation, sometimes effervescence, sometimes excess; sometimes too continued an exercise of a revolutionary power.[288]
 
Recent decades have seen, to take just a few examples, collateral damage (from the 1970s), ethnic cleansing (from the 1990s), and extraordinary rendition (from the 2000s).
Euphemisms can be effective for several reasons. Words that are literal synonyms may contrast in their emotional coloring, like slender and skinny, fat and Rubenesque, or an obscene word and its genteel synonym. In The Stuff of Thought I argued that most euphemisms work more insidiously: not by triggering reactions to the words themselves but by engaging different conceptual interpretations of the state of the world.[289]
For example, a euphemism can confer plausible deniability on what is essentially a lie. A listener unfamiliar with the facts could understand transfer of population to imply moving vans and train tickets. A choice of words can also imply different motives and hence different ethical valuations. Collateral damage implies that a harm was an unintended by-product rather than a desired end, and that makes a legitimate moral difference. One could almost use collateral damage with a straight face to describe the hapless worker on the side track who was sacrificed to prevent the runaway trolley from killing five workers on the main one. All of these phenomena—emotional connotation, plausible deniability, and the ascription of motives—can be exploited to alter the way an action is construed.
A second mechanism of moral disengagement is gradualism. People can slide into barbarities a baby step at a time that they would never undertake in a single plunge, because at no point does it feel like they are doing anything terribly different from the current norm.[290]
An infamous historical example is the Nazis’ euthanizing of the handicapped and mentally retarded and their disenfranchisement, harassment, ghettoization, and deportation of the Jews, which culminated in the events referred to by the ultimate euphemism, the Final Solution. Another example is the phasing of decisions in the conduct of war. Material assistance to an ally can morph into military advisors and then into escalating numbers of soldiers, particularly in a war of attrition. The bombing of factories can shade into the bombing of factories near neighborhoods, which can shade into the bombing of neighborhoods. It’s unlikely that any participant in the Milgram experiment would have zapped the victim with a 450-volt shock on the first trial; the participants were led to that level in an escalating series, starting with a mild buzz. Milgram’s experiment was what game theorists call an Escalation game, which is similar to a War of Attrition.[291]
If the participant withdraws from the experiment as the shocks get more severe, he forfeits whatever satisfaction he might have enjoyed from carrying out his responsibilities and advancing the cause of science and thus would have nothing to show for the anxiety he has suffered and the pain he has caused the victim. At each increment, it always seems to pay to stick it out one trial longer and hope that the experimenter will announce that the study is complete.
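The incremental logic can be made concrete with a minimal sketch, using toy numbers of my own rather than anything from Milgram’s data or the text above: a myopic participant weighs only the small marginal cost of the next shock against the hoped-for payoff of finishing, treats the distress already suffered as a sunk cost, and so keeps escalating even though he would refuse the final voltage if it were the very first request.

```python
# Illustrative sketch only: the payoff numbers and the 20% "maybe it ends after
# the next trial" hope are assumptions for exposition, not figures from Milgram's study.

P_ENDS_NEXT = 0.2        # assumed hoped-for chance the experimenter stops after the next trial
COMPLETION_PAYOFF = 100  # assumed value of finishing with something to show for the ordeal
STEP_COST = 5            # assumed marginal distress of delivering one more shock
LEVELS = 30              # Milgram's generator had 30 switches, from 15 to 450 volts

def value_of_quitting() -> float:
    # Quitting forfeits the payoff; the distress already suffered is a sunk cost.
    return 0.0

def value_of_one_more_trial() -> float:
    # Continuing costs a little now but keeps alive the hope of finishing soon.
    return P_ENDS_NEXT * COMPLETION_PAYOFF - STEP_COST

level = 0
while level < LEVELS and value_of_one_more_trial() > value_of_quitting():
    level += 1  # each step looks locally worthwhile, so the escalation never stops

print(f"Escalated through {level} of {LEVELS} shock levels without ever deciding to quit.")
```

The point of the sketch is only that the comparison at each step is local; the cumulative harm never enters the decision.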
A third disengagement mechanism is the displacement or diffusion of responsibility. Milgram’s mock experimenter always insisted to the participants that he bore full responsibility for whatever happened. When the patter was rewritten and the participant was told that he or she was responsible, compliance plummeted. We have already seen that a second willing participant emboldens the first; Bandura showed that the diffusion of responsibility is a critical factor.[292]
When participants in a Milgram-like experiment think that the voltage they have chosen will be averaged with the levels chosen by two other participants, they give stronger shocks. The historical parallels are obvious. “I was only following orders” is the clichéd defense of accused war criminals. And murderous leaders deliberately organize armies, killing squads, and the bureaucracies behind them in such a way that no single person can feel that his actions are necessary or sufficient for the killings to occur.[293]
A fourth way of disabling the usual mechanisms of moral judgment is distancing. We have seen that unless people are in the throes of a rampage or have sunk into sadism, they don’t like harming innocent people directly and up close.[294]
In the Milgram studies, bringing the victim into the same room as the participant reduced the proportion of participants who delivered the maximum shock by a third. And requiring the participant to force the victim’s hand down onto an electrode plate reduced it by more than half. It’s safe to say that the pilot of the Enola Gay who dropped the atomic bomb over Hiroshima would not have agreed to immolate a hundred thousand people with a flamethrower one at a time. And as we saw in chapter 5, Paul Slovic has confirmed the observation attributed to Stalin that one death is a tragedy but a million deaths is a statistic.[295]
People cannot wrap their minds around large (or even small) numbers of people in peril, but will readily mobilize to save the life of a single person with a name and a face.
A fifth means of jiggering the moral sense is to derogate the victim. We have seen that demonizing and dehumanizing a group can pave the way toward harming its members. Bandura confirmed this sequence by allowing some of his participants to overhear the experimenter making an offhand derogatory remark about the ethnicity of a group of people who (they thought) were taking part in the study.[296]
The participants who overheard the remark upped the voltage of the shocks they gave to those people. The causal arrow can go in the other direction as well. If people are manipulated into harming someone, they can retroactively downgrade their opinion of the people they have harmed. Bandura found that about half of the participants who had shocked a victim explicitly justified their action. Many did so by blaming the victim (a phenomenon Milgram had noticed as well), writing things like “Poor performance is indicative of laziness and a willingness to test the supervisor.”
