Rich is wearing a watch that Carol recently bought for him, and he shows it off to Kismet and asks for an opinion. Rich admits that the week before, he almost lost the watch.
Rich: I want to show you something. This is a watch that my . . . this is a watch that my girlfriend gave me.
Kismet: [babbles with interest and encouragement; looks down to the watch]
Rich: Yeah, look, it’s got a little blue light in it too.... You like it? I almost lost it this week.
 
When Kismet’s reaction to all of this girlfriend talk is to sound shy, deferent, and sympathetic, Rich seems to play with the notion that this robot could be an interested party. He’s enjoying himself. And when the robot responds a bit out of turn and in a low come-hither tone, Rich loses his footing and abandons himself to their exchange. His interaction with Kismet becomes decidedly flirtatious.[1]
Kismet can mimic human prosody, so when Rich becomes intimate in his tone, so does the robot. The two could easily be at a cocktail party or at a bar.[2]
Rich: Do you know what it’s like to lose something?
Kismet: [nods with assent; sounds warm in its interest]
Rich: You are amazing.
 
At this point, Kismet appreciatively repeats something close to the word “amazing.” Rich, smitten, now seems to operate within an inchoate fantasy that he might want something from this robot; there is something here for him. During their exchanges, when Kismet glances away from him, Rich moves to the side and gestures to the robot to follow him. At one point the robot talks over him and Rich says, “No, stop. No, no, no stop. Listen to me. Listen to me. I think we have something going. I think there’s something here between us.”
Indeed, something is going on between them. As Rich tries to leave, Kismet will not be put off and holds Rich back with a persuasive purr. Rich flirts back and tries to catch Kismet’s gaze. Successful, Kismet’s eyes now follow Rich. When Kismet lowers its eyes, suddenly “shy,” Rich does not want to let go. We are at a moment of more. Who is leading and who is following in this dance? As in a moment of romantic encounter, one loses track and discovers a new rhythm where it doesn’t matter; each animates and reanimates the other. Rich senses that he has lost control in a way that pleases him. He steps in with a raised finger to mark the moment:
Rich: Stop, you’ve got to let me talk. Shh, shh, shh . . .
Kismet: [sounds happy, one might say giggly, flattered]
Rich: Kismet, I think we’ve got something going on here. You and me . . . you’re amazing.
 
Rich, dazzled, asks again, “What are you?” Parting comes next—but not easily. There is an atmosphere of sweet sorrow, equally distributed.
Rich: Bye [regretful].
Kismet: [purrs in a warm tone]
Rich: Bye [in a softer, lower tone].
Kismet: [makes low “intimate” sounds]
Rich: Okay . . . all right.
 
Finally, Rich gives up. He is not leaving. He says to Kismet, “You know what? Hang on a second. I still want to talk to you; I’ve got a couple of things I want to say to you.” The video ends with Rich staring at Kismet, lost in his moment of more.
In this encounter we see how complicity gratifies by offering a fantasy of near communion. As our relationships with robots intensify, we move from wonder at what we have made to the idea that we have made something that will care for us and, beyond that, be fond of us. And then, there is something else: a wish to come ever closer to our creations—to be somehow enlivened by them. A robotic body meets our physicality with its own. A robot’s gaze, face, and voice allow us to imagine a meeting of the minds.
A MOMENT OF MORE: THE DANCER AND THE DANCE
 
In our studies, children imagined that Cog and Kismet were alive enough to evolve. In one common fantasy, they would have offspring with Cog’s body and Kismet’s face. Only a few years later, Cog and Kismet have direct heirs, new robots built by graduate students who were junior members of the Cog and Kismet teams. One of them is Domo, designed by Aaron Edsinger. It has a vastly improved version of Kismet’s face, speech, and vision—this robot really can have a conversation—and a vastly improved version of Cog’s body. Domo makes eye contact, shows expression, and follows human motion. Its grasp has a humanlike resistance. Cog mirrored human motion, but Domo knows how to collaborate.
Domo is designed to provide simple household help for the elderly or disabled.[3]
I visit the robot on a day when Edsinger is “teaching” it to perform simple actions: to recognize objects, throw a ball, shelve groceries. But as is the case with all the MIT sociable robots, when one spends time with Domo, its effects transcend such down-to-earth intentions. Even technically sophisticated visitors describe a moment when Domo seems hesitant to release their hand. This moment could be experienced as unpleasant or even frightening—as contact with a robot out of control. Instead, people are more likely to describe it as thrilling. One feels the robot’s attention; more than this, one senses the robot’s desire. And then, of course, one lectures oneself that the robot has none.
For Edsinger, this sequence—experiencing Domo as having desires and then talking himself out of the idea—becomes familiar. For even though he is Domo’s programmer, the robot’s behavior has not become dull or predictable. Working together, Edsinger and Domo appear to be learning from each other. When Edsinger teaches Domo to hand him a ball or put an object into a cup, their simple actions read as an intimate ballet. They seem to be getting closer.
Edsinger extends his hand and asks for a ball. “Domo, give it,” he says softly. Domo picks up a ball and makes eye contact. “Give it,” the robot says and gently puts the ball in Edsinger’s hand. Edsinger asks Domo to place a carton of milk on a shelf: “Domo, shelf.” Domo repeats the instructions and complies. Edsinger asks, “How are things going, Domo?” Domo says, “Okay,” as he follows new instructions to shelve a bag of ground coffee and moves on to pouring salad dressing into a cup. “Domo, give it,” says Edsinger, and Domo hands Edsinger the salad dressing.
Just as the children crowded around Cog to attach toys to its arms, shoulders, and back, seeking physical involvement, Edsinger works close to his robot and admits he enjoys it:
Having physical contact—being in the robot space—it’s a very rich interaction when you are really, really engaged with it like that. Once Domo is trying to reach for a ball that I’m holding and something is wrong with his control. The arms are kind of pushing out and I’m grabbing the arms and pushing them down and it’s like a kid trying to get out of something; I feel physically coupled with Domo—in a way very different from what you could ever have with a face on a screen.... You definitely have the sense that it wants this thing and you’re trying to keep it from doing what it wants. It’s like a stubborn child. The frustration—you push the arm down and it stops and it tries again.... It takes on a very stubborn child quality. I’ve worked on Kismet. I’ve worked on Cog. All these other robots . . . none of them really have that sort of physical relationship.
 
Edsinger notes that people quickly learn how to work with Domo in a way that makes it easier for the robot to perform as desired. He reminds me that when we share tasks with other people, we don’t try to trip each other up—say, by handing each other cereal boxes at funny angles. We try to be easy on each other. We do the same with Domo. “People,” says Edsinger, “are very perceptive about the limitations of the person they’re working with or the robot they’re working with . . . and so if they understand that Domo can’t quite do something, they will adapt very readily to that and try and assist it. So robots can be fairly dumb and still do a lot if they’re working with a person because the person can help them out.”
As Domo’s programmer, Edsinger explicitly exploits the familiar ELIZA effect, that desire to cover for a robot in order to make it seem more competent than it actually is. In thinking about Kismet and Cog, I spoke of this desire as complicity. Edsinger thinks of it as getting Domo to do more “by leveraging the people.” Domo needs the help. It understands very little about any task as a whole. Edsinger says, “To understand something subtle about a person’s intent, it’s really going to be hard to put that in the robot.” What Domo can do, says Edsinger, is “keep track of where a person is and ask, ‘Am I looking at a person reaching in the direction of my gaze?’—stuff like that. There’s no model of the person.” And yet, Edsinger himself says he experiences Domo as almost alive—almost uncomfortably so. For him, much of this effect comes from being with Domo as it runs autonomously for long periods—say, a half hour at a time—rather than being constrained, as he was on earlier projects, to try out elements of a robot’s program in thirty-second intervals. “I can work with Domo for a half hour and never do the exact same thing twice,” he says.[4]
If this were said about a person, that would be a dull individual indeed. But by robotic standards, a seemingly unprogrammed half hour enchants.
Over a half hour, says Edsinger, Domo “moves from being this thing that you flip on and off and test a little bit of to something that’s running all the time.... You transition out of the machine thing to thinking of it as not so much a creature but as much more fluid in terms of being . . . [long hesitation] Well, you start to think of it as a creature, but this is part of what makes the research inherently uncomfortable. I enjoy that. That’s part of the reason I like building robots.”
Thrilled by moments when the “creature” seems to escape, unbidden, from the machine, Edsinger begins to think of Domo’s preferences not as things he has programmed but as the robot’s own likes and dislikes.[5]
He says,
For me, when it starts to get complicated . . . sometimes I know that the robot is not doing things of its own “volition” because these are behaviors, well, I literally put them in there. But every now and then . . . the coordination of its behaviors is rich enough . . . well, it is of its own volition . . . and it catches you off guard. And to me this is what makes it fun . . . and it happens to me more and more now that I have more stuff running on it. . . .
If it doesn’t know what to do, it will look around and find a person. And if it can’t find a person, it looks to the last place [it] saw a person. So, I’ll be watching it do something, and it will finish, and it will look up at me as if to say, “I’m done; [I want your] approval.”
 
In these moments, there is no deception. Edsinger knows how Domo “works.” He experiences a connection where knowledge does not interfere with wonder. This is the intimacy presaged by the children for whom Cog was demystified but who wanted it to love them all the same.
Edsinger feels close to Domo as creature and machine. He believes that such feelings will sustain people as they learn to collaborate with robots. Astronauts and robots will go on space flights together. Soldiers and robots will go on missions together. Engineers and robots will maintain nuclear plants together. To be sold on partnership with robots, people need to feel more than comfortable with them. People should want to be around them. For Edsinger, this will follow naturally from the pleasure of physical contact with robotic partners. He says it is thrilling “just to experience something acting with some volition. There is an object, it is aware of my presence, it recognizes me, it wants to interact with me.”
Edsinger does not fall back on the argument that we need helper robots because there will not be enough people to care for each other in the future. For him, creating sociable robots is its own adventure. The robots of the future will be cute, want to hug, and want to help. They will work alongside people, aware of their presence and wishes. Edsinger admits that it will be “deceiving, if people feel the robots know more than they do or care more than they do.” But he does not see a moral issue. First, information about the robot’s limitations is public, out there for all the world to see. Second, we have already decided that it is acceptable to be comforted by creatures that may not really care for us: “We gain comfort from animals and pets, many of which have very limited understanding of us.” Why should we not embrace new relationships (with robots) with new limitations?
And besides, argues Edsinger, and this is an argument that has come up before, we take comfort in the presence of people whose true motivations we don’t know. We assign caring roles to people who may not care at all. This might happen when, during a hospitalization, a nurse takes our hand. How important is it that this nurse wants to hold our hand? What if this is a rote gesture, something close to being programmed? Is it important that this programmed nurse be a person? For Edsinger, it is not. “When Domo holds my hand,” he says, “it always feels good.... There is always that feeling of an entity making contact that it wants, that it needs. I like that, and I am willing to let myself feel that way . . . just the physical warm and fuzzy sense of being wanted, knowing full well that it is not caring.” I ask Edsinger to clarify. Is it pleasurable to be touched even if he knows that the robot doesn’t “want” to touch him? Edsinger is sure of his answer: “Yes.” But a heartbeat later he retracts it: “Well, there is a part of me that is trying to say, well, Domo cares.”
