The Better Angels of Our Nature: Why Violence Has Declined

Author: Steven Pinker

Tags: #Sociology, #Psychology, #Science, #21st Century, #Crime, #Anthropology, #Social History, #Criminology


In the first such tournament, hosted by the political scientist Robert Axelrod, the winner was a simple strategy of Tit for Tat: cooperate on the first move, then continue to cooperate if your partner cooperates, but defect if he defects. 184
Since cooperation is rewarded and defection punished, defectors will switch to cooperation, and in the long run everyone wins. The idea is identical to Robert Trivers’s theory of the evolution of reciprocal altruism, which he had proposed a few years earlier without the mathematical paraphernalia. 185
The positive-sum reward arises from gains in exchange (each can confer a large benefit to the other at a small cost to itself), and the temptation is to exploit the other, taking the benefit without paying the cost. Trivers’s theory that the moral emotions are adaptations to cooperation can be translated directly into the Tit for Tat algorithm. Sympathy is cooperating on the first move. Gratitude is cooperating with a cooperator. And anger is defecting against a defector—in other words, punishing in revenge. The punishment can consist in refusing to help, but it can also consist in causing a harm. Vengeance is no disease: it is necessary for cooperation, preventing a nice guy from being exploited.
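To make the strategy concrete, here is a minimal sketch in Python of Tit for Tat playing an Iterated Prisoner’s Dilemma against Always Defect. The payoff values (5 for exploiting a cooperator, 3 for mutual cooperation, 1 for mutual defection, 0 for being exploited) are the conventional ones from the game-theory literature, not figures given in the text.

```python
# Minimal sketch of Tit for Tat in an Iterated Prisoner's Dilemma.
# Payoff values are the conventional ones (T=5, R=3, P=1, S=0); the text
# itself does not specify numbers.

COOPERATE, DEFECT = "C", "D"

# (my move, opponent's move) -> my payoff this round
PAYOFF = {
    (COOPERATE, COOPERATE): 3,  # reward for mutual cooperation
    (COOPERATE, DEFECT):    0,  # sucker's payoff: exploited
    (DEFECT,    COOPERATE): 5,  # temptation: exploit a cooperator
    (DEFECT,    DEFECT):    1,  # punishment for mutual defection
}

def tit_for_tat(my_history, opp_history):
    """Cooperate on the first move, then copy the opponent's previous move."""
    if not opp_history:
        return COOPERATE              # "sympathy": be nice on the first move
    return opp_history[-1]            # "gratitude" or "anger": mirror the partner

def always_defect(my_history, opp_history):
    return DEFECT

def play(strategy_a, strategy_b, rounds=10):
    """Run an iterated game and return the two total scores."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # (30, 30): steady cooperation
print(play(tit_for_tat, always_defect))  # (9, 14): exploited on the first move, then retaliates
```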
Hundreds of Iterated Prisoner’s Dilemma tournaments have been studied since then, and a few new lessons have emerged. 186
One is that Tit for Tat, simple though it is, can be dissected into features that account for its success and can be recombined into other strategies. These features have been named for personality traits, and the labels may be more than mnemonics; the dynamics of cooperation may explain why we evolved those traits. The first feature behind Tit for Tat’s success is that it is nice: it cooperates on the first move, thereby tapping opportunities for mutually beneficial cooperation, and it does not defect unless defected upon. The second is that it is clear: if a strategy’s rules of engagement are so complicated that the other players cannot discern how it is reacting to what they do, then its moves are effectively arbitrary, and if they are arbitrary, the best response is the strategy Always Defect. Tit for Tat is easy for other strategies to cotton on to, and they can adjust their choices in response. Third, Tit for Tat is retaliatory: it responds to defection with defection, the simplest form of revenge. And it is forgiving: it leaves the gates of repentance open, so if its adversary switches to cooperation after a history of defection, Tit for Tat immediately cooperates in return. 187
The last feature, forgivingness, turns out to be more important than anyone first appreciated. A weakness of Tit for Tat is that it is vulnerable to error and misunderstanding. Suppose one of the players intends to cooperate but defects by mistake. Or suppose it misperceives another player’s cooperation as defection, and defects in retaliation. Then its opponent will defect in retaliation, which forces it to retaliate in turn, and so on, dooming the players to an endless cycle of defection—the software equivalent of a feud. In a noisy world in which misunderstanding and error are possible, Tit for Tat is bested by an even more forgiving strategy called Generous Tit for Tat. Every once in a while Generous Tit for Tat will randomly grant forgiveness to a defector and resume cooperating. The act of unconditional forgiveness can flick a duo that has been trapped in a cycle of mutual defection back onto the path of cooperation.
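In code, the difference from plain Tit for Tat is a single extra branch. This is a minimal sketch reusing the constants from the sketch above; the 10 percent forgiveness rate is an arbitrary choice for illustration, since the best rate depends on how noisy the environment is.

```python
import random

COOPERATE, DEFECT = "C", "D"   # same constants as in the earlier sketch

def generous_tit_for_tat(my_history, opp_history, forgiveness=0.1):
    """Like Tit for Tat, but after the opponent defects it sometimes cooperates
    anyway. The 10% forgiveness rate is an arbitrary choice for illustration."""
    if not opp_history:
        return COOPERATE          # be nice on the first move
    if opp_history[-1] == DEFECT and random.random() < forgiveness:
        return COOPERATE          # unconditional forgiveness: break the defection cycle
    return opp_history[-1]        # otherwise mirror the opponent as usual
```

In a play loop that occasionally flips a player’s intended move to simulate error, two plain Tit for Tat players stay locked in retaliation after the first slip, while two Generous players eventually stumble back into mutual cooperation.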
A problem for overly forgiving strategies, though, is that they can be undone if the population contains a few psychopaths who play Always Defect and a few suckers who play Always Cooperate. The psychopaths proliferate by preying on the suckers, and then become numerous enough to exploit everyone else. One successful contender in such a world is Contrite Tit for Tat, which is more discriminating in its forgiveness. It remembers its own behavior, and if a round of mutual defection had been its fault because of a random error or misunderstanding, it allows its opponent one free defection and then switches to cooperation. But if the defection had been triggered by its opponent, it shows no mercy and retaliates. If the opponent is a Contrite Tit for Tatter as well, then it will excuse the justified retaliation, and the pair will settle back into cooperation. So not just vengeance, but also forgiveness and contrition, are necessary for social organisms to reap the benefits of cooperation.
The evolution of cooperation critically depends on the possibility of repeated encounters. It cannot evolve in a one-shot Prisoner’s Dilemma, and it collapses even in an Iterated Prisoner’s Dilemma if the players know they are playing a limited number of rounds, because as the end of the game approaches, each is tempted to defect without fear of retribution. For similar reasons, subsets of players who are stuck with playing against each other—say because they are neighbors who cannot move—tend to be more forgiving than ones who can pick up and choose another neighborhood in which to find partners. Cliques, organizations, and other social networks are virtual neighborhoods because they force groups of people to interact repeatedly, and they too tilt people toward forgiveness, because mutual defection would be ruinous to everyone.
Human cooperation has another twist. Because we have language, we don’t have to deal with people directly to learn whether they are cooperators or defectors. We can ask around, and find out through the grapevine how the person has behaved in the past. This indirect reciprocity, as game theorists call it, puts a tangible premium on reputation and gossip. 188
Potential cooperators have to balance selfishness against mutual benefit not just in dealing with each other in pairs but when acting collectively in groups. Game theorists have explored a multiplayer version of the Prisoner’s Dilemma called the Public Goods game. 189
Each player can contribute money toward a common pool, which then is doubled and divided evenly among the players. (One can imagine a group of fishermen chipping in for harbor improvements such as a lighthouse, or merchants in a block of stores pooling contributions for a security guard.) The best outcome for the group is for everyone to contribute the maximum amount. But the best outcome for an individual is to stint on his own contribution and be a free rider on the profits from everyone else’s. The tragedy is that contributions will dwindle to zero and everyone ends up worse off. (The biologist Garrett Hardin proposed an identical scenario called the Tragedy of the Commons. Each farmer cannot resist grazing his own cow on the town commons, stripping it bare to everyone’s loss. Pollution, overfishing, and carbon emissions are equivalent real-life examples.) 190
But if players have the opportunity to punish free riders, as if in revenge for their exploitation of the group, then the players have an incentive to contribute, and everyone profits.
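The arithmetic of the dilemma, and of how punishment reverses it, can be sketched directly. The doubling of the pool follows the text; the ten-unit endowment and the particular fine and punishment-cost values are arbitrary choices for illustration.

```python
def public_goods_round(contributions, multiplier=2.0):
    """One round of the Public Goods game as described: contributions go into a
    common pool, the pool is doubled, and the proceeds are split evenly.
    Returns each player's net payoff (share of the pool minus own contribution)."""
    pool = sum(contributions) * multiplier
    share = pool / len(contributions)
    return [share - c for c in contributions]

def with_punishment(contributions, fine=3.0, cost=1.0, full=10.0):
    """Adds an illustrative punishment stage: each full contributor pays `cost`
    to fine every free rider (anyone who contributed less than `full`) by `fine`.
    The fine, cost, and endowment values are arbitrary, not from the text."""
    payoffs = public_goods_round(contributions)
    free_riders = [i for i, c in enumerate(contributions) if c < full]
    punishers = [i for i, c in enumerate(contributions) if c >= full]
    for i in free_riders:
        payoffs[i] -= fine * len(punishers)      # fined once by each punisher
    for i in punishers:
        payoffs[i] -= cost * len(free_riders)    # punishing is costly, too
    return payoffs

# Four players, each with 10 units to contribute or keep.
print(public_goods_round([10, 10, 10, 10]))  # [10.0, 10.0, 10.0, 10.0]: everyone gains
print(public_goods_round([10, 10, 10, 0]))   # [5.0, 5.0, 5.0, 15.0]: free riding pays
print(with_punishment([10, 10, 10, 0]))      # [4.0, 4.0, 4.0, 6.0]: now it does not
```

Without punishment the lone defector walks away with 15 rather than the 10 that universal contribution would have paid him; with punishment his take falls to 6, so contributing becomes the better bet even for a purely selfish player.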
The modeling of the evolution of cooperation has become increasingly byzantine, because so many worlds can be simulated so cheaply. But in the most plausible of these worlds, we see the evolution of the all-too-human phenomena of exploitation, revenge, forgiveness, contrition, reputation, gossip, cliques, and neighborliness.
 
So does revenge pay in the real world? Does the credible threat of punishment induce fear in the heart of potential exploiters and deter them from exploiting? The answer from the lab is yes. 191
When people actually act out Prisoner’s Dilemma games in experiments, they tend toward Tit for Tat–like strategies and end up enjoying the fruits of cooperation. When they play the Trust game (another version of Prisoner’s Dilemma, which was the game used in the neuroimaging experiments on revenge), the ability of an investor to punish a faithless trustee puts enough fear into the trustee to return a fair share of the appreciated investment. In Public Goods games, when people are given the opportunity to punish free riders, people don’t free-ride. And remember the studies in which participants’ essays were savaged and they had an opportunity to shock their critics in revenge? If they knew that the critic would then get a turn to shock them back—to take revenge for the revenge—they held back on the intensity of the shocks. 192
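A back-of-the-envelope sketch shows why the threat works in the Trust game. The tripling of the investment is the usual laboratory convention rather than a figure from the text, and the punishment stage, in which the investor pays one unit to dock the trustee two, uses a hypothetical exchange rate.

```python
def trust_game(invested, returned_fraction, multiplier=3.0):
    """Sketch of a Trust game round: the investor sends `invested`, it is
    multiplied (tripling is the usual laboratory convention; the text gives no
    figure), and the trustee chooses what fraction of the pot to send back.
    Returns (investor's net payoff from the exchange, trustee's payoff)."""
    pot = invested * multiplier
    back = pot * returned_fraction
    return back - invested, pot - back

def punish_trustee(investor_payoff, trustee_payoff, spent, rate=2.0):
    """Illustrative punishment stage: the investor pays `spent` to reduce the
    trustee's payoff by `rate` times that amount (a hypothetical exchange rate)."""
    return investor_payoff - spent, trustee_payoff - rate * spent

# A trustee who keeps everything gains 30 on a 10-unit investment, until a
# vengeful investor spends 10 to strip 20 of it away; then returning half
# would have been the better deal.
print(trust_game(10, 0.0))                        # (-10.0, 30.0): betrayal pays
print(punish_trustee(*trust_game(10, 0.0), 10))   # (-20.0, 10.0): unless it is avenged
print(trust_game(10, 0.5))                        # (5.0, 15.0): trust repaid, both gain
```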
Revenge can work as a deterrent only if the avenger has a reputation for being willing to avenge and a willingness to carry it out even when it is costly. That helps explain why the urge for revenge can be so implacable, consuming, and sometimes self-defeating (as with pursuers of self-help justice who slay an unfaithful spouse or an insulting stranger). 193 Moreover, it is most effective when the target knows that the punishment came from the avenger so he can recalibrate his behavior toward the avenger in the future. 194
That explains why an avenger’s thirst is slaked only when the target knows he has been singled out for the punishment. 195
These impulses implement what judicial theorists call specific deterrence: a punishment is targeted at a wrongdoer to prevent him from repeating a crime.
The psychology of revenge also implements what judicial theorists call general deterrence: a publicly decreed punishment that is designed to scare third parties away from the temptations of crime. The psychological equivalent of general deterrence is the cultivation of a reputation for being the kind of person who cannot be pushed around. (You don’t tug on Superman’s cape; you don’t spit into the wind; you don’t pull the mask off the old Lone Ranger; and you don’t mess around with Jim.) Experiments have shown that people punish more severely, even at a price that is greater than the amount out of which they have been cheated, when they think an audience is watching. 196
And as we saw, men are twice as likely to escalate an argument into a fight when spectators are around. 197
The effectiveness of revenge as a deterrent can explain actions that are otherwise puzzling. The rational actor theory, popular in economics and political science, has long been embarrassed by people’s behavior in yet another game, the Ultimatum game. 198
One participant, the proposer, gets a sum of money to divide between himself and another participant, the respondent, who can take it or leave it. If he leaves it, neither side gets anything. A rational proposer would keep the lion’s share; a rational respondent would accept the remaining crumbs, no matter how small, because part of a loaf is better than none. In actual experiments the proposer tends to offer almost half of the jackpot, and the respondent doesn’t settle for much less than half, even though turning down a smaller share is an act of spite that punishes both participants. Why do actors in these experiments behave so irrationally? The rational actor theory had neglected the psychology of revenge. When a proposal is too stingy, the respondent gets angry—indeed, the neuroimaging study I mentioned earlier, in which the insula lit up in anger, used the Ultimatum game to elicit it. 199
The anger impels the respondent to punish the proposer in revenge. Most proposers anticipate the anger, so they make an offer that is just generous enough to be accepted. When they don’t have to worry about revenge, because the rules of the game are changed and the respondent has to accept the split no matter what (a variation called the Dictator game), the offer is stingier.
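The contrast between the two games fits in a few lines of code. The ten-unit pot and the respondent’s three-unit rejection threshold below are arbitrary illustrations, not parameters from the experiments the text describes.

```python
def ultimatum(pot, offer, min_acceptable):
    """Sketch of the Ultimatum game: the proposer offers `offer` out of `pot`;
    the respondent accepts only if the offer meets their threshold, otherwise
    both get nothing (spiteful punishment that costs the punisher too).
    Returns (proposer's payoff, respondent's payoff)."""
    if offer >= min_acceptable:
        return pot - offer, offer
    return 0, 0                       # rejection: revenge at a price to both

def dictator(pot, offer):
    """Dictator game variant: the recipient has no veto, so any split stands."""
    return pot - offer, offer

# With a 10-unit pot and a respondent who rejects anything under 3 units (an
# illustrative threshold), a lowball offer costs the proposer everything, so a
# proposer who anticipates the respondent's anger offers more.
print(ultimatum(10, 1, 3))   # (0, 0): the stingy offer is punished
print(ultimatum(10, 4, 3))   # (6, 4): a near-even split goes through
print(dictator(10, 1))       # (9, 1): with no threat of revenge, stinginess pays
```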
 
We still have a puzzle. If revenge evolved as a deterrent, then why is it used so often in the real world? Why doesn’t revenge work like the nuclear arsenals in the Cold War, creating a balance of terror that keeps everyone in line? Why should there ever be cycles of vendetta, with vengeance begetting vengeance?
A major reason is the Moralization Gap. People consider the harms they inflict to be justified and forgettable, and the harms they suffer to be unprovoked and grievous. This bookkeeping makes the two sides in an escalating fight count the number of strikes differently and weigh the inflicted harm differently as well. 200
As the psychologist Daniel Gilbert has put it, the two combatants in a long-running war often sound like a pair of boys in the backseat of a car making their respective briefs to their parents: “He hit me first!” “He hit me harder!” 201
