Alone Together
Sherry Turkle

And this is where we are in the robotic moment. One of the world’s most sophisticated robot “users” cannot resist the idea that pressure from a robot’s hand implies caring. If we are honest with ourselves about what machines care about, we must accept their ultimate indifference. And yet, a hand that reaches for ours says, “I need you. Take care of me. Attend to me. And then, perhaps, I will—and will want to—attend to you.” Again, what robots offer meets our human vulnerabilities. We can interact with robots in full knowledge of their limitations, comforted nonetheless by what must be an unrequited love.
A MOMENT OF MORE: MERGING MIND AND BODY
 
In the fall of 2005, performance artist Pia Lindman came to MIT with communion on her mind. Lindman had an artistic vision: she would find ways to merge her face and body with MIT’s sociable robots. She hoped that by trying, she would come to know their minds. For Lindman, the robots were what Emerson would have called “test objects.” She imagined that immersion in a robot’s nature might give her a new understanding of her own.
The MIT sociable robots are inspired by a philosophical tradition that sees mind and body as inseparable. Following Immanuel Kant, Martin Heidegger, Maurice Merleau-Ponty, and, more recently, Hubert Dreyfus and Antonio Damasio, this tradition argues that our bodies are quite literally instruments of thought; therefore, any computer that wants to be intelligent had better start out with one.[6]
Not all schools of artificial intelligence have been sympathetic to this way of seeing things. One branch of the field, often referred to as “symbolic AI,” associates itself with a Cartesian mind/body dualism and argues that machine intelligence can be programmed through rules and the representation of facts.[7]
In the 1960s, philosopher Hubert Dreyfus took on the symbolic AI community when he argued that “computers need bodies in order to be intelligent.”[8]
This position has a corollary: whatever intelligence machines may achieve, it will never be the kind that people have because no body given to a machine will be a human body. Therefore, the machine’s intelligence, no matter how interesting, will be alien.[9]
Neuroscientist Antonio Damasio takes up this argument from a different research tradition. For Damasio, all thinking and all emotion are embodied. The absence of emotion reduces the scope of rationality because we literally think with our feelings; thus the rebuking title of his 1994 book Descartes’ Error.[10]
Damasio insists that there is no mind/body dualism, no split between thought and feeling. When we have to make a decision, brain processes that are shaped by our body guide our reasoning by remembering our pleasures and pains. This can be taken as an argument for why robots will never have a humanlike intelligence: they have neither bodily feelings nor feelings of emotion. These days, roboticists such as Brooks take up that challenge. They grant that intelligence may indeed require bodies and even emotions, but insist that they don’t have to be human ones. And in 2005, it was Brooks to whom Lindman applied when she wanted to join her mind and body to a machine.
A precursor to Lindman’s work with robots was her 2004 project on grief. She chose photographs of people grieving from the New York Times—a mother bending over a dead child, a husband learning he has lost his wife to a terrorist attack. Then, she sketched several hundred of the photographs and began to act them out, putting her face and body into the positions of the people in the photographs. Lindman says she felt grief as she enacted it. Biology makes this so. The shape of a smile or frown releases chemicals that affect mental state.[11]
And in humans, “mirror neurons” fire both when we observe others acting and when we act ourselves. Our bodies find a way to implicate us emotionally in what we see.[12]
Lindman came out of the grief project wanting to further explore the connection between embodiment and emotion. So, closely tracking that project’s methodology, she began to work with machines that had bodies. Teaming up with Edsinger, she videotaped his interactions with Domo, sketched the interactions of man and robot, and then learned to put herself in the place of both.[13]
Her enactments included Edsinger’s surprise at being surprised when Domo does something unexpected; his pleasure when he holds down the robot’s hand in order to get things done, and Domo, responding, seems to want freedom; his thrill in the moment when Domo finishes its work and looks around for the last place it saw a human, the place that Edsinger occupies. Through communion with man and robot, Lindman hoped to experience the gap between the human and the machine. In the end, Lindman created a work of art that both addresses and skirts the question of desire.
At an MIT gallery in the spring of 2006, Lindman performed the results of her work with Edsinger and Domo. On the walls she mounted thirty-four drawings of herself and the robot. In some drawings, Lindman assumes Domo’s expression when disengaged, and she looks like a machine; in others, Domo is caught in moments of intense “engagement,” and it looks like a person. In the drawings, Domo and Lindman seem equally comfortable in the role of person or machine, comfortable being each other.
The performance itself began with a video of Edsinger and Domo working together. They interact with an elegant economy of gesture. These two know each other very well. They seem to anticipate each other, look after each other. The video was followed by Lindman “enacting” Domo on a raised stage. She was dressed in gray overalls, her hair pulled into a tight bun. Within a few minutes, I forgot the woman and saw the machine. And then Lindman played both parts: human and machine. This time, within minutes, I saw two humans. And then, figure turned to ground, and I saw two machines, two very fond machines. Or was it two machines that were perhaps too fond? I was with a colleague who saw it the other way, first two machines and then two humans. Either way, Lindman had made her point: the boundaries between people and things are shifting. What of these boundaries is worth maintaining?
Later, I meet privately with Lindman, and she talks about her performance and her experience making the film. “I turn myself into the human version of Domo . . . and I feel the connection between [Edsinger] and Domo. . . . You feel the tenderness, the affection in their gestures. Their pleasure in being together.” She dwells on a sequence in which Edsinger tries to get Domo to pick up a ball. At one moment, the ball is not in Domo’s field of vision. The robot looks toward Edsinger, as though orienting to a person who can help, a person whom it trusts. It reaches for Edsinger’s hands. For the robot, says Lindman, “there is information to be gathered through touch.” Domo and Edsinger stare at each other, with Domo’s hands on Edsinger’s as though in supplication. Lindman says that in enacting Domo for this sequence, she “couldn’t think about seeking the ball. . . . I’ve always thought about it as a romantic scene.”
For Lindman this scene is crucial. In trying to play a robot, she found that the only way to get it right was to use a script that involved love. “The only way I was able to start memorizing the movements was to create a narrative. To put emotions into the movements made me remember the movements.” She is aware that Edsinger had a different experience. He had moments when he saw the robot as both program and creature: “A lot of times he’d be looking at the screen with the code scrolling by. . . . He is looking at the robot’s behavior, at its internal processes, but also is drawn into what is compelling in the physical interaction.” Edsinger wrote Domo’s code, but also learns from touching Domo’s body. Watching these moments on film, I see the solicitous touch of a mother who puts her hand on her child’s forehead to check for fever.
Of a scene in which Edsinger holds down Domo’s hand to prevent a collision, Lindman says,
[Edsinger] is holding Domo’s hand like this [Lindman demonstrates by putting one hand over another] and looks into Domo’s eyes to understand what it’s doing: Where are its eyes going? Is it confused? Is it trying to understand what it’s seeing or is it understanding what it’s seeing? To get eye contact with Domo is, like, a key thing. And he gets it. He’s actually looking at Domo trying to understand what it’s looking at, and then Domo slowly turns his head and looks him in the eye. And it’s this totally romantic moment.
 
Edsinger, too, has described this moment as one in which he feels the pleasure of being sought after. So, it is not surprising that to enact it, Lindman imagined robot and man in a moment of desire. She says, “It is as though I needed the robot to seem to have emotions in order to understand it.” She is able to play Domo only if she plays a woman desiring a man. “It is,” she admits, “the scene I do best.”
In the grief project, the position of her body brought Lindman to experiences of abjection, something that she now attributes to mirror neurons. She had expected that doubling for a robot would be very different because “it has no emotion.” But in the end, she had to create emotions to become an object without emotion. “To remember the robot’s motions, I had to say: ‘It does this because it feels this way.’ . . . It wasn’t like I was feeling it, but I had to have that logic.” Except that (think of the mirror neurons) Lindman was feeling it. And despite herself, she couldn’t help but imagine those feelings in the machine. Lindman’s account becomes increasingly complex as she grapples with her experience. If the subject is communion with the inanimate, these are the telling contradictions of an expert witness.[14]
The philosopher Emmanuel Lévinas writes that the presence of a face initiates the human ethical compact.[15]
The face communicates, “Thou shalt not kill me.” We are bound by the face even before we know what stands behind it, even before we might learn that it is the face of a machine. The robotic face signals the presence of a self that can recognize another. It puts us in a landscape where we seek recognition. This is not about a robot’s being able to recognize us. It is about our desire to have it do so.
Lindman could not play Edsinger without imagining him wanting the robot’s recognition; she could not play Domo without imagining it wanting Edsinger’s recognition. So, Lindman’s enactment of Domo looking for a green ball interprets the robot as confused, seeking the person closest to it, locking eyes, and taking the person’s hand to feel comforted. It is a moment, classically, during which a person might experience a feeling of communion. Edsinger—not just in Lindman’s re-creation—feels this closeness, unswayed by his knowledge of the mechanisms behind the robot’s actions. For Lindman, such interactions spark “a crisis about what is authentic and real emotion.”
Lindman worries that the romantic scripts she uses “might not seem to us authentic” because robots “are of mechanism not spirit.” In her grief project, however, she found that grief is always expressed in a set of structured patterns, programmed, she thinks, by biology and culture. So we, like the robots, have programs beneath our expression of feelings. We are constrained by mechanisms, even in our most emotional moments. And if our emotions are mediated by such programming, asks Lindman, how different are our emotions from those of a machine? For Lindman, the boundary is disappearing. We are authentic in the way a machine can be, and a machine can be authentic in the way a person can be.
And this is where I began. The questions for the future are not whether children will love their robot companions more than their pets or even their parents. The questions are rather, What will love be? And what will it mean to achieve ever-greater intimacy with our machines? Are we ready to see ourselves in the mirror of the machine and to see love as our performances of love?
In her enactments of grief, Lindman felt her body produce a state of mind. And in much the same spirit, when she enacts Domo, she says she “feels” the robot’s mind. But Lindman is open to a more transgressive experience of the robot mind. After completing the Domo project, she begins to explore how she might physically connect her face to the computer that controls the robot Mertz.
Lijin Aryananda’s Mertz, a metal head on a flexible neck, improves on Kismet’s face, speech, and vision. Like Kismet, Mertz has expressive brows above its black ping-pong ball eyes—features designed to make a human feel kindly toward the robot. But this robot can actually speak simple English. Like Domo, Mertz has been designed as a step toward a household companion and helper. Over time, and on its own, it is able to recognize a set of familiar individuals and chat with them using speech with appropriate emotional cadence. Lindman hopes that if she can somehow “plug herself” into Mertz, she will have a direct experience of its inner state. “I will experience its feelings,” she says excitedly. And Lindman wants to have her brain scanned while she is hooked up to Mertz in order to compare images of her brain activity to what we know is going on in the machine. “We can actually look at both,” she says. “I will be the embodiment of the AI and we will see if [when the robot smiles] my brain is smiling.”
Lindman soon discovers that a person cannot make her brain into the output device for a robot intelligence. So, she modifies her plan. Her new goal is to “wear” Mertz’s facial expressions by hooking up her face rather than her brain to the Mertz computer, to “become the tool for the expression of the artificial intelligence.” After working with Domo, Lindman anticipates that she will experience a gap between who she is and what she will feel as she tries to be the robot. She hopes the experiment will help her understand what is specific to her as a human. In that sense, the project is about yearning for communion with the machine as well as inquiring into whether communion is possible. Lindman imagines the gap: “You will say, ‘Okay, so there’s the human.’”[16]
