Alone Together

Author: Sherry Turkle

After Leon has been with Cog for about an hour, the boy becomes preoccupied with whether he has spent enough time with Cog to make a lasting impression. His thoughts return to the tall blond researcher who “gets to be with Cog all the time.” Leon is sure that Cog is in love with her. Leon chides her: “He keeps looking at you. He is in love with you.” Leon then settles on a new idea: “Cog is a boy and so obviously likes girls more than boys.” This at least is a reason why he doesn’t stand a chance here. Leon wonders whether he might have more success with Kismet, which the children usually see as a female because of its doll eyes, red lips, and long eyelashes.
Most children find a way to engage with a faltering robot, imagining themselves as parents or teachers or healers. But both Estelle and Leon became depressed when they were not “recognized.” Other frustrated children persevere in anger. Edward, six, is small for his age. What he lacks in size he makes up for in energy. From the start, he announces that he wants to be the “best at everything about the robots.” His father tells us that at home and at school, Edward likes to be “in charge.” He plays rough and gets into fights. With no prologue, Edward walks up to Kismet and asks, “Can you talk?” When Kismet doesn’t answer, Edward repeats his question at greater volume. Kismet stares into space. Again, Edward asks, “Can you talk?” Now, Kismet speaks in the emotionally layered babble that has delighted other children or puzzled them into inventive games. This is not Edward’s reaction to this winsome speaker of nonsense. He tries to understand Kismet: “What?” “Say that again?” “What exactly?” “Huh? What are you saying?” After a few minutes, Edward decides that Kismet is making no sense. He tells the robot, “Shut up!” And then, Edward picks up objects in the laboratory and forces them into Kismet’s mouth—first a metal pin, then a pencil, then a toy caterpillar. Edward yells, “Chew this! Chew this!” Absorbed by hostility, he remains engaged with the robot.
Shawn, six years older than Edward, has a similar reaction. He visits the lab with his two younger brothers on whom he rains insults as they all wait to visit the robots. When Shawn meets Kismet, he calms down, and his tone is friendly: “What’s your name?” But when Kismet is silent, Shawn becomes enraged. He covers the cameras that serve as Kismet’s eyes and orders, “Say something!” Kismet remains silent. Shawn sits silently too, staring at Kismet as though sizing up an opponent. Suddenly, he shouts, “Say, ‘Shut up!’ Say, ‘Shut up!’” “Say, ‘Hi!’ . . . Say, ‘Blah!’” The adults in the room are silent; we gave the children no rules about what they could and could not say. Suddenly, Kismet says, “Hi.” Shawn smiles and tries to get Kismet to speak again. When Kismet does not respond, Shawn forces his pen into Kismet’s mouth. “Here! Eat this pen!” Shawn, like Edward, does not tire of this exercise.
One way to look at Estelle and Leon, Edward and Shawn is to say that these children are particularly desperate for attention, control, and a sense of connection. And so, when the robots disappoint, they are more affected than other children. Of course, this is true. But this explanation puts the full burden on the children. Another way to look at their situation puts more of the burden on us. What would we have given to these children if the robots had been in top form? In the cases of Edward and Shawn, we have two “class bullies,” the kids everyone is afraid of. But these boys are lonely. As bullies, they are isolated, often alone or surrounded by children who are not friends but whom they simply boss around. They see robots as powerful, technological, and probably expensive. It is exciting to think about controlling something like that. For them, a sociable robot is a possible friend—one that would not ask for too much in return and would never reject them, but in whom they might confide. But like the insecure Estelle and Leon, these are the children who most need relationships that will model mutuality, where control is not the main thing on the table. Why do we propose machine companionship to them in the first place? From this perspective, problems aren’t limited to when the robots break down. Vulnerable children are not helped even when the robots are doing just fine.
AGAIN, ON AN ETHICAL TERRAIN
 
In the robot laboratory, children are surrounded by adults talking to and teaching robots. The children quickly understand that Cog needs Brian Scassellati and Kismet needs Cynthia Breazeal. The children imagine Scassellati and Breazeal to be the robots’ parents. Both are about to leave the Artificial Intelligence laboratory, where they have been graduate students, and move on to faculty positions.
Breazeal will be staying at MIT but leaving the AI Lab for the Media Lab. The two are down the street from each other, but the tradition of academic property rights demands that Kismet, like Cog, be left behind in the laboratory that paid for its development. The summer of the first-encounters study is the last time Breazeal will have access to Kismet. Breazeal describes a sharp sense of loss. Building a new Kismet will not be the same. This is the Kismet she has “raised” from a “child.” She says she would not be able to part with Kismet if she weren’t sure it would remain with people who would treat it well.
It comes as no surprise that separation is not easy for Breazeal; more striking is how hard it is for those around Kismet to imagine the robot without her. A ten-year-old who overhears a conversation among graduate students about how Kismet will remain in the lab quietly objects, “But Cynthia is Kismet’s mother.”[12]
Watching Breazeal interact with Kismet, one does sense a maternal connection, one that Breazeal describes as “going beyond its being a mere machine.” She knows Kismet’s every move, and yet, she doesn’t. There are still surprises that delight. Her experience calls to mind a classic science fiction story by Brian Aldiss, “Supertoys Last All Summer Long,” best known through its movie adaptation, the Steven Spielberg film A.I.: Artificial Intelligence.[13] In A.I., scientists build a humanoid robot, David, who is programmed to love. David expresses his love to a woman, Monica, who has adopted him as her child.
The pressing issue raised by this film is not the potential reality of a robot that “loves”—we are far from building anything like the robot David—but how Monica’s feelings come about. Monica is a human being who responds to a machine that asks for nurturance by caring for it. Her response to a robot that reaches out to her is confusion mixed with love and attachment.
It would be facile to make a simple analogy between Breazeal’s situation and that of Monica in A.I., but Breazeal is, in fact, one of the first people to have one of the signal experiences in that story—sadness caused by separation from a robot to which one has formed an attachment based on nurturance. At issue here is not Kismet’s achieved level of intelligence but Breazeal’s journey: in a very limited sense, Breazeal “brought up” Kismet. But even that very limited experience provokes strong emotion. Being asked to nurture a machine constructs us as its parents. This new relationship creates its own loop, drawing us into the complicities that make it possible. We are asked to nurture. We want to help. We become open to playing along, willing to defer to what the robot is able to do.
In fiction and myth, human beings imagine themselves “playing God” and creating new forms of life. Now, in the real, sociable robots suggest a new dynamic. We have created something that we relate to as an “other,” an equal, not something over which we wield godlike power. As these robots get more sophisticated—more refined in their ability to target us—these feelings grow stronger. We are drawn by our humanity to give to these machines something of the consideration we give to each other. Because we reach for mutuality, we want them to care about us as we care for them. They can hurt us.
I noted earlier the chilling credibility of the interaction between Madison and Kismet and the desperation of children who seem to need these robots too much. Cog and Kismet are successful in getting children to relate to them “for real.” It is the robots’ success that gives me pause, as does the prospect of “conversations” between the most needy among us—the disadvantaged young, the deprived elderly, the emotionally and physically disabled—and ever more lifelike sociable robots. Roboticists want us to consider a “best-case” scenario in which robotic companions serve as mentors, first steps toward more complex encounters. Even My Real Baby was marketed as a robot that could teach your child “socialization.” I am skeptical. I believe that sociable technology will always disappoint because it promises what it cannot deliver. It promises friendship but can only deliver performances. Do we really want to be in the business of manufacturing friends that will never be friends?
Roboticists will argue that there is no harm in people engaging in conversations with robots; the conversations may be interesting, fun, educational, or comforting. But I find no comfort here. A machine taken as a friend demeans what we mean by friendship. Whom we like, who likes us—these things make us who we are. When Madison felt joyful in Kismet’s “affection,” I could not be glad. I felt in the shadow of an experiment, just beginning, in which humans are the subjects.
Even now, our excitement about the possibilities for robot/human interaction moves us to play fast and loose with our emotions. In one published experiment, two young children are asked to spend time with a man and a robot designed to be his clone.[14] The experiment has a significant backstory. Japanese roboticist Hiroshi Ishiguro built androids that duplicate himself, his wife, and his five-year-old daughter. The daughter’s first reaction when she saw her android clone was to flee. She refused to go near it and would no longer visit her father’s laboratory. Years later, when the daughter was ten, a group of psychologists designed a study in which this girl and a four-year-old boy (a child of one of the researchers) were asked to interact with both Ishiguro and his android double. Both children begin the study reluctant to interact with the android. Then, both (by measures such as “makes eye contact” and “speaks”) become willing to engage almost equally with the man and with the robot. Ishiguro’s daughter is finally able to sit in a room alone with her father’s android clone. It is hard to know how to comment on this narrative of a frightened child who makes ever-fainter objections to her part in this experiment. It seems to have little in it that is positive. Yet, the authors use this narrative as evidence of success: children will be open to humanlike robots as teachers, babysitters, and companions. But what could it mean to this child to sit with her father’s machine double? What could she want from it? Why does it matter that she is finally willing to make eye contact and speak with it? Why would we want her to? It is easy to become so immersed in technology that we ignore what we know about life.
CHAPTER 6
 
Love’s Labor Lost
 
When Takanori Shibata took the floor at a spring 2009 meeting at MIT’s AgeLab, he looked triumphant. The daylong conference centered on robots for the elderly, and Shibata, inventor of the small, seal-like sociable robot Paro, was the guest of honor. The AgeLab’s mission is to create technologies for helping the elderly with their physical and emotional needs, and already Paro had carved out a major role on this terrain. Honored by Guinness World Records as “the most therapeutic robot in the world” in 2002, Paro had been front and center in Japan’s initiative to use robots to support senior citizens.[1] Now Shibata proudly announced that Denmark had just placed an order for one thousand Paros for its elder-care facilities. The AgeLab gathering marked the beginning of Paro’s American launch.
Shibata showed a series of videos: smiling elderly men and women in Japanese nursing homes welcoming the little furry “creature” into their arms; seniors living at home speaking appreciatively about the warmth and love that Paro brought them; agitated and anxious seniors calming down in Paro’s company.[2]
The meeting buzzed with ideas about how best to facilitate Paro’s acceptance into American elder care. The assembled engineers, physicians, health administrators, and journalists joined in a lively, supportive discussion. They discussed what kind of classification Shibata should seek to facilitate Paro’s passage through the legendary scrutiny of the Food and Drug Administration.
I heard only one negative comment. A woman who identified herself as a nurse said that she and her colleagues had worked long and hard to move away from representing the elderly as childlike. To her, Paro seemed “a throwback, a new and fancier teddy bear.” She ended by saying that she believed nurses would resist the introduction of Paro and objects like it into nursing homes. I lowered my eyes. I had made a decision to attend this meeting as an observer, so I said nothing. At the time, I had been studying Paro in Massachusetts nursing homes for several years. Most often, nurses, attendants, and administrators had been happy for the distraction it provided. I was not at all sure that nurses would object to Paro.
In any case, the nurse’s concern was met with silence, something I have come to anticipate at such gatherings. In robotics, new “models” are rarely challenged. All eyes focus on technical virtuosity and the possibilities for efficient implementation. At the AgeLab, the group moved on to questions about Paro’s price, now set at some $6,000 a unit. Was this too high for something that might be received as a toy? Shibata thought not. Nursing homes were already showing willingness to pay for so valuable a resource. And Paro, he insisted, is not a toy. It reacts to how it is treated (is a touch soft or aggressive?) and spoken to (it understands about five hundred English words, more in Japanese). It has proved itself an object that calms the distraught and depressed. And Shibata claimed that unlike a toy, Paro is robust, ready for the rough-and-tumble of elder care. I bit my lip. At the time I had three broken Paros in my basement, casualties of my own nursing home studies. Why do we believe that the next technology we dream up will be the first to prove not only redemptive but indestructible?