Solomon suggested that our emotional state, like our perception of the color of the world, is kept in equilibrium by a balance of opposing circuits. Fear is in balance with reassurance, euphoria with depression, hunger with satiety. The main difference between opposing emotions and complementary colors is in how they change with experience. With the emotions, a person’s initial reaction gets weaker over time, and the balancing impulse gets stronger. As an experience is repeated, the emotional rebound is more keenly felt than the emotion itself. The first leap in a bungee jump is terrifying, and the sudden yoiiiiing of deceleration exhilarating, followed by an interlude of tranquil euphoria. But with repeated jumps the reassurance component strengthens, which makes the fear subside more quickly and the pleasure arrive earlier. If the most concentrated moment of pleasure is the sudden reversal of panic by reassurance, then the weakening of the panic response over time may require the jumper to try increasingly dangerous jumps to get the same degree of exhilaration. The action-reaction dynamic may be seen with positive initial experiences as well. The first hit of heroin is euphoric, and the withdrawal mild. But as the person turns into a junkie, the pleasure lessens and the withdrawal symptoms come earlier and are more unpleasant, until the compulsion is less to attain the euphoria than to avoid the withdrawal.
According to Baumeister, sadism follows a similar trajectory.^259
An aggressor experiences a revulsion to hurting his victim, but the discomfort cannot last forever, and eventually a reassuring, energizing counteremotion resets his equilibrium to neutral. With repeated bouts of brutality, the reenergizing process gets stronger and turns off the revulsion earlier. Eventually it predominates and tilts the entire process toward enjoyment, exhilaration, and then craving. As Baumeister puts it, the pleasure is in the backwash.
By itself the opponent-process theory is a bit too crude, predicting, for example, that people would hit themselves over the head because it feels so good when they stop. Clearly not all experiences are governed by the same tension between reaction and counterreaction, nor by the same gradual weakening of the first and strengthening of the second. There must be a subset of aversive experiences that especially lend themselves to being overcome. The psychologist Paul Rozin has identified a syndrome of acquired tastes he calls benign masochism.^260
These paradoxical pleasures include consuming hot chili peppers, strong cheese, and dry wine, and partaking in extreme experiences like saunas, skydiving, car racing, and rock climbing. All of them are adult tastes, in which a neophyte must overcome a first reaction of pain, disgust, or fear on the way to becoming a connoisseur. And all are acquired by controlling one’s exposure to the stressor in gradually increasing doses. What they have in common is a coupling of high potential gains (nutrition, medicinal benefits, speed, knowledge of new environments) with high potential dangers (poisoning, exposure, accidents). The pleasure in acquiring one of these tastes is the pleasure of pushing the outside of the envelope: of probing, in calibrated steps, how high, hot, strong, fast, or far one can go without bringing on disaster. The ultimate advantage is to open up beneficial regions in the space of local experiences that are closed off by default by innate fears and cautions. Benign masochism is an overshooting of this motive of mastery, and as Solomon and Baumeister point out, the revulsion-overcoming process can overshoot so far as to result in craving and addiction. In the case of sadism, the potential benefits are dominance, revenge, and sexual access, and the potential dangers are reprisals from the victim or victim’s allies. Sadists do become connoisseurs—the instruments of torture in medieval Europe, police interrogation centers, and the lairs of serial killers can be gruesomely sophisticated—and sometimes they can become addicts.
The fact that sadism is an acquired taste is both frightening and hopeful. As a pathway prepared by the motivational systems of the brain, sadism is an ever-present danger to individuals, security forces, or subcultures who take the first step and can proceed to greater depravity in secrecy. Yet it does have to be acquired, and if those first steps are blocked and the rest of the pathway bathed in sunlight, the path to sadism can be foreclosed.
IDEOLOGY
Individual people have no shortage of selfish motives for violence. But the really big body counts in history pile up when a large number of people carry out a motive that transcends any one of them: an ideology. Like predatory or instrumental violence, ideological violence is a means to an end. But with an ideology, the end is idealistic: a conception of the greater good.^261
Yet for all that idealism, it’s ideology that drove many of the worst things that people have ever done to each other. They include the Crusades, the European Wars of Religion, the French Revolutionary and Napoleonic Wars, the Russian and Chinese civil wars, the Vietnam War, the Holocaust, and the genocides of Stalin, Mao, and Pol Pot. An ideology can be dangerous for several reasons. The infinite good it promises prevents its true believers from cutting a deal. It allows any number of eggs to be broken to make the utopian omelet. And it renders opponents of the ideology infinitely evil and hence deserving of infinite punishment.
We have already seen the psychological ingredients of a murderous ideology. The cognitive prerequisite is our ability to think through long chains of means-ends reasoning, which encourage us to carry out unpleasant means as a way to bring about desirable ends. After all, in some spheres of life the ends really do justify the means, such as the bitter drugs and painful procedures we undergo as part of a medical treatment. Means-ends reasoning becomes dangerous when the means to a glorious end include harming human beings. The design of the mind can encourage the train of theorization to go in that direction because of our drives for dominance and revenge, our habit of essentializing other groups, particularly as demons or vermin, our elastic circle of sympathy, and the self-serving biases that exaggerate our wisdom and virtue. An ideology can provide a satisfying narrative that explains chaotic events and collective misfortunes in a way that flatters the virtue and competence of believers, while being vague or conspiratorial enough to withstand skeptical scrutiny.^262
Let these ingredients brew in the mind of a narcissist with a lack of empathy, a need for admiration, and fantasies of unlimited success, power, brilliance, and goodness, and the result can be a drive to implement a belief system that results in the deaths of millions.
But the puzzle in understanding ideological violence is not so much psychological as epidemiological: how a toxic ideology can spread from a small number of narcissistic zealots to an entire population willing to carry out its designs. Many ideological beliefs, in addition to being evil, are patently ludicrous—ideas that no sane person would ever countenance on his or her own. Examples include the burning of witches because they sank ships and turned men into cats, the extermination of every last Jew in Europe because their blood would pollute the Aryan race, and the execution of Cambodians who wore eyeglasses because it proved they were intellectuals and hence class enemies. How can we explain extraordinary popular delusions and the madness of crowds?
Groups can breed a number of pathologies of thought. One of them is polarization. Throw a bunch of people with roughly similar opinions into a group to hash them out, and the opinions will become more similar to one another, and more extreme as well.^263
The liberal groups become more liberal; the conservative groups more conservative. Another group pathology is obtuseness, a dynamic that the psychologist Irving Janis called groupthink.^264
Groups are apt to tell their leaders what they want to hear, to suppress dissent, to censor private doubts, and to filter out evidence that contradicts an emerging consensus. A third is animosity between groups.^265
Imagine being locked in a room for a few hours with a person whose opinions you dislike—say, you’re a liberal and he or she is a conservative or vice versa, or you sympathize with Israel and the other person sympathizes with the Palestinians or vice versa. Chances are the conversation between the two of you would be civil, and it might even be warm. But now imagine that there are six on your side and six on the other. There would probably be a lot of hollering and red faces and perhaps a small riot. The overall problem is that groups take on an identity of their own in people’s minds, and individuals’ desire to be accepted within a group, and to promote its standing in comparison to other groups, can override their better judgment.
Even when people are not identifying with a well-defined group, they are enormously influenced by the people around them. One of the great lessons of Stanley Milgram’s experiments on obedience to authority, widely appreciated by psychologists, is the degree to which the behavior of the participants depended on the immediate social milieu.^266
Before he ran the experiment, Milgram polled his colleagues, students, and a sample of psychiatrists on how far they thought the participants would go when an experimenter instructed them to shock a fellow participant. The respondents unanimously predicted that few would exceed 150 volts (the level at which the victim demands to be freed), that just 4 percent would go up to 300 volts (the setting that bore the warning “Danger: Severe Shock”), and that only a handful of psychopaths would go all the way to the highest shock the machine could deliver (the setting labeled “450 Volts—XXX”). In fact, 65 percent of the participants went all the way to the maximum shock, long past the point when the victim’s agonized protests had turned to an eerie silence. And they might have kept on shocking the presumably comatose subject (or his corpse) had the experimenter not brought the proceedings to a halt. The percentage barely budged with the sex, age, or occupation of the participants, and it varied only a small amount with their personalities. What did matter was the physical proximity of other people and how they behaved. When the experimenter was absent and his instructions were delivered over the telephone or in a recorded message, obedience fell. When the victim was in the same room instead of an adjacent booth, obedience fell. And when the participant had to work in tandem with a second participant (a confederate of the experimenter), then if the confederate refused to comply, so did the participant. But when the confederate complied, more than 90 percent of the time the participant did too.
People take their cues on how to behave from other people. This is a major conclusion of the golden age of social psychology, when experiments were a kind of guerrilla theater designed to raise consciousness about the dangers of mindless conformity. Following a 1964 news report—almost entirely apocryphal—that dozens of New Yorkers watched impassively as a woman named Kitty Genovese was raped and stabbed to death in their apartment courtyard, the psychologists John Darley and Bibb Latané conducted a set of ingenious studies on so-called bystander apathy.^267
The psychologists suspected that groups of people might fail to respond to an emergency that would send an isolated person leaping to action because in a group, everyone assumes that if no one else is doing anything, the situation couldn’t be all that dire. In one experiment, as a participant was filling out a questionnaire, he or she heard a loud crash and a voice calling out from behind a partition: “Oh . . . my foot . . . I . . . can’t move it; oh . . . my ankle . . . I can’t get this thing off me.” Believe it or not, if the participant was sitting with a confederate who continued to fill out the questionnaire as if nothing was happening, 80 percent of the time the participant did nothing too. When the participants were alone, only 30 percent failed to respond.
People don’t even need to witness other people behaving callously to behave in uncharacteristically callous ways. It is enough to place them in a fictive group that is defined as being dominant over another one. In another classic psychology-experiment-cum-morality-play (conducted in 1971, before committees for the protection of human subjects put the kibosh on the genre), Philip Zimbardo set up a mock prison in the basement of the Stanford psychology department, divided the participants at random into “prisoners” and “guards,” and even got the Palo Alto police to arrest the prisoners and haul them to the campus hoosegow.^268
Acting as the prison superintendent, Zimbardo suggested to the guards that they could flaunt their power and instill fear in the prisoners, and he reinforced the atmosphere of group dominance by outfitting the guards with uniforms, batons, and mirrored sunglasses while dressing the prisoners in humiliating smocks and stocking caps. Within two days some of the guards took their roles too seriously and began to brutalize the prisoners, forcing them to strip naked, clean toilets with their bare hands, do push-ups with the guards standing on their backs, or simulate sodomy. After six days Zimbardo had to call off the experiment for the prisoners’ safety. Decades later Zimbardo wrote a book that analogized the unplanned abuses in his own faux prison to the unplanned abuses at the Abu Ghraib prison in Iraq, arguing that a situation in which a group of people is given authority over another group can bring out barbaric behavior in individuals who might never display it in other circumstances.
Many historians of genocide, like Christopher Browning and Benjamin Valentino, have invoked the experiments of Milgram, Darley, Zimbardo, and other social psychologists to make sense of the puzzling participation, or at least acquiescence, of ordinary people in unspeakable atrocities. Bystanders often get caught up in the frenzy around them and join in the looting, gang rapes, and massacres. During the Holocaust, soldiers and policemen rounded up unarmed civilians, lined them up in front of pits, and shot them to death, not out of animus toward the victims or a commitment to Nazi ideology but so that they would not shirk their responsibilities or let down their brothers-in-arms. Most of them were not even coerced by a threat of punishment for insubordination. (My own experience in carrying out instructions to shock a laboratory rat against my better judgment makes this disturbing claim utterly believable to me.) Historians have found few if any cases in which a German policeman, soldier, or guard suffered a penalty for refusing to carry out the Nazis’ orders.^269
As we shall see in the next chapter, people even moralize conformity and obedience. One component of the human moral sense, amplified in many cultures, is the elevation of conformity and obedience to praiseworthy virtues.