Alone Together

Sherry Turkle

Always impressed with Levy’s inventiveness, I found myself underwhelmed by the message of this latest book, Love and Sex with Robots.4 No tongue-in-cheek science fiction fantasy, it was reviewed without irony in the New York Times by a reporter who had just spent two weeks at the Massachusetts Institute of Technology (MIT) and wrote glowingly about its robotics culture as creating “new forms of life.”5 Love and Sex is earnest in its predictions about where people and robots will find themselves by mid-century: “Love with robots will be as normal as love with other humans, while the number of sexual acts and lovemaking positions commonly practiced between humans will be extended, as robots will teach more than is in all of the world’s published sex manuals combined.”6
Levy argues that robots will teach us to be better friends and lovers because we will be able to practice on them. Beyond this, they will substitute where people fail. Levy proposes, among other things, the virtues of marriage to robots. He argues that robots are, of course, “other” but, in many ways, better. No cheating. No heartbreak. In Levy’s argument, there is one simple criterion for judging the worth of robots in even the most intimate domains: Does being with a robot make you feel better? The master of today’s computerspeak judges future robots by the impact of their behavior. And his next bet is that in a very few years, this is all we will care about as well.
I am a psychoanalytically trained psychologist. Both by temperament and profession, I place high value on relationships of intimacy and authenticity. Granting that an AI might develop its own origami of lovemaking positions, I am troubled by the idea of seeking intimacy with a machine that has no feelings, can have no feelings, and is really just a clever collection of “as if” performances, behaving as if it cared, as if it understood us. Authenticity, for me, follows from the ability to put oneself in the place of another, to relate to the other because of a shared store of human experiences: we are born, have families, and know loss and the reality of death.7 A robot, however sophisticated, is patently out of this loop.
So, I turned the pages of Levy’s book with a cool eye. What if a robot is not a “form of life” but a kind of performance art? What if “relating” to robots makes us feel “good” or “better” simply because we feel more in control? Feeling good is no golden rule. One can feel good for bad reasons. What if a robot companion makes us feel good but leaves us somehow diminished? The virtue of Levy’s bold position is that it forces reflection: What kinds of relationships with machines are possible, desirable, or ethical? What does it mean to love a robot? As I read Love and Sex, my feelings on these matters were clear. A love relationship involves coming to savor the surprises and the rough patches of looking at the world from another’s point of view, shaped by history, biology, trauma, and joy. Computers and robots do not have these experiences to share. We look at mass media and worry about our culture being intellectually “dumbed down.” Love and Sex seems to celebrate an emotional dumbing down, a willful turning away from the complexities of human partnerships—the inauthentic as a new aesthetic.
I was further discomforted as I read Love and Sex because Levy had interpreted my findings about the “holding power” of computers to argue his case. Indeed, Levy dedicated his book to Anthony, an MIT computer hacker I interviewed in the early 1980s. Anthony was nineteen when I met him, a shy young man who found computers reassuring. He felt insecure in the world of people with its emotional risks and shades of gray. The activity and interactivity of computer programming gave Anthony—lonely, yet afraid of intimacy—the feeling that he was not alone.8 In Love and Sex, Levy idealizes Anthony’s accommodation and suggests that loving a robot would be a reasonable next step for people like him. I was sent an advance copy of the book, and Levy asked if I could get a copy to Anthony, thinking he would be flattered. I was less sure. I didn’t remember Anthony as being at peace with his retreat to what he called “the machine world.” I remembered him as wistful, feeling himself a spectator of the human world, like a kid with his nose to the window of a candy store. When we imagine robots as our future companions, we all put our noses to that same window.
I was deep in the irony of my unhappy Anthony as a role model for intimacy with robots when the Scientific American reporter called. I was not shy about my lack of enthusiasm for Levy’s ideas and suggested that the very fact we were discussing marriage to robots at all was a comment on human disappointments—that in matters of love and sex, we must be failing each other. I did not see marriage to a machine as a welcome evolution in human relationships. And so I was taken aback when the reporter suggested that I was no better than bigots who deny gays and lesbians the right to marry. I tried to explain that just because I didn’t think people should marry machines didn’t mean that any mix of adult people wasn’t fair territory. He accused me of species chauvinism: Wasn’t I withholding from robots their right to “realness”? Why was I presuming that a relationship with a robot lacked authenticity? For me, the story of computers and the evocation of life had come to a new place.
At that point, I told the reporter that I, too, was taking notes on our conversation. The reporter’s point of view was now data for my own work on our shifting cultural expectations of technology—data, that is, for the book you are reading. His analogizing of robots to gay men and women demonstrated that, for him, future intimacy with machines would not be a second-best substitute for finding a person to love. More than this, the reporter was insisting that machines would bring their own special qualities to an intimate partnership that needed to be honored in its own right. In his eyes, the love, sex, and marriage robot was not merely “better than nothing,” a substitute. Rather, a robot had become “better than something.” The machine could be preferable—for any number of reasons—to what we currently experience in the sometimes messy, often frustrating, and always complex world of people.
This episode with the Scientific American reporter shook me—perhaps in part because the magazine had been for me, since childhood, a gold standard in scientific publication. But the extravagance of the reporter’s hopes for robots fell into a pattern I had been observing for nearly a decade. The encounter over Love and Sex most reminded me of another time, two years before, when I met a female graduate student at a large psychology conference in New Orleans; she had taken me aside to ask about the current state of research on robots designed to serve as human companions. At the conference, I had given a presentation on anthropomorphism—on how we see robots as close to human if they do such things as make eye contact, track our motion, and gesture in a show of friendship. These appear to be “Darwinian buttons” that cause people to imagine that the robot is an “other,” that there is, colloquially speaking, “somebody home.”
During a session break, the graduate student, Anne, a lovely, raven-haired woman in her mid-twenties, wanted specifics. She confided that she would trade in her boyfriend “for a sophisticated Japanese robot” if the robot would produce what she called “caring behavior.” She told me that she relied on a “feeling of civility in the house.” She did not want to be alone. She said, “If the robot could provide the environment, I would be happy to help produce the illusion that there is somebody really with me.” She was looking for a “no-risk relationship” that would stave off loneliness. A responsive robot, even one just exhibiting scripted behavior, seemed better to her than a demanding boyfriend. I asked her, gently, if she was joking. She told me she was not. An even more poignant encounter was with Miriam, a seventy-two-year-old woman living in a suburban Boston nursing home, a participant in one of my studies of robots and the elderly.
I meet Miriam in an office that has been set aside for my interviews. She is a slight figure in a teal blue silk blouse and slim black pants, her long gray hair parted down the middle and tied behind her head in a low bun. Although elegant and composed, she is sad. In part, this is because of her circumstances. For someone who was once among Boston’s best-known interior designers, the nursing home is a stark and lonely place. But there is also something immediate: Miriam’s son has recently broken off his relationship with her. He has a job and family on the West Coast, and when he visits, he and his mother quarrel—he feels she wants more from him than he can give. Now Miriam sits quietly, stroking Paro, a sociable robot in the shape of a baby harp seal. Paro, developed in Japan, has been advertised as the first “therapeutic robot” for its ostensibly positive effects on the ill, elderly, and emotionally troubled. Paro can make eye contact by sensing the direction of a human voice, is sensitive to touch, and has a small working English vocabulary for “understanding” its users (the robot’s Japanese vocabulary is larger); most importantly, it has “states of mind” affected by how it is treated. For example, it can sense whether it is being stroked gently or with aggression. Now, with Paro, Miriam is lost in her reverie, patting down the robot’s soft fur with care. On this day, she is particularly depressed and believes that the robot is depressed as well. She turns to Paro, strokes him again, and says, “Yes, you’re sad, aren’t you? It’s tough out there. Yes, it’s hard.” Miriam’s tender touch triggers a warm response in Paro: it turns its head toward her and purrs approvingly. Encouraged, Miriam shows yet more affection for the little robot. In attempting to provide the comfort she believes it needs, she comforts herself.
Because of my training as a clinician, I believe that this kind of moment, if it happens between people, has profound therapeutic potential. We can heal ourselves by giving others what we most need. But what are we to make of this transaction between a depressed woman and a robot? When I talk to colleagues and friends about such encounters—for Miriam’s story is not unusual—their first associations are usually to their pets and the solace they provide. I hear stories of how pets “know” when their owners are unhappy and need comfort. The comparison with pets sharpens the question of what it means to have a relationship with a robot. I do not know whether a pet could sense Miriam’s unhappiness, her feelings of loss. I do know that in the moment of apparent connection between Miriam and her Paro, a moment that comforted her, the robot understood nothing. Miriam experienced an intimacy with another, but she was in fact alone. Her son had left her, and as she looked to the robot, I felt that we had abandoned her as well.
Experiences such as these—with the idea of aliveness on a “need-to-know” basis, with the proposal and defense of marriage to robots, with a young woman dreaming of a robot lover, and with Miriam and her Paro—have caused me to think of our time as the “robotic moment.” This does not mean that companionate robots are common among us; it refers to our state of emotional—and I would say philosophical—readiness. I find people willing to seriously consider robots not only as pets but as potential friends, confidants, and even romantic partners. We don’t seem to care what these artificial intelligences “know” or “understand” of the human moments we might “share” with them. At the robotic moment, the performance of connection seems connection enough. We are poised to attach to the inanimate without prejudice. The phrase “technological promiscuity” comes to mind.
As I listen for what stands behind this moment, I hear a certain fatigue with the difficulties of life with people. We insert robots into every narrative of human frailty. People make too many demands; robot demands would be of a more manageable sort. People disappoint; robots will not. When people talk about relationships with robots, they talk about cheating husbands, wives who fake orgasms, and children who take drugs. They talk about how hard it is to understand family and friends. I am at first surprised by these comments. Their clear intent is to bring people down a notch. A forty-four-year-old woman says, “After all, we never know how another person really feels. People put on a good face. Robots would be safer.” A thirty-year-old man remarks, “I’d rather talk to a robot. Friends can be exhausting. The robot will always be there for me. And whenever I’m done, I can walk away.”
The idea of sociable robots suggests that we might navigate intimacy by skirting it. People seem comforted by the belief that if we alienate or fail each other, robots will be there, programmed to provide simulations of love.9 Our population is aging; there will be robots to take care of us. Our children are neglected; robots will tend to them. We are too exhausted to deal with each other in adversity; robots will have the energy. Robots won’t be judgmental. We will be accommodated. An older woman says of her robot dog, “It is better than a real dog. . . . It won’t do dangerous things, and it won’t betray you. . . . Also, it won’t die suddenly and abandon you and make you very sad.”10
The elderly are the first to have companionate robots aggressively marketed to them, but young people also see the merits of robotic companionship. These days, teenagers have sexual adulthood thrust upon them before they are ready to deal with the complexities of relationships. They are drawn to the comfort of connection without the demands of intimacy. This may lead them to a hookup—sex without commitment or even caring. Or it may lead to an online romance—companionship that can always be interrupted. Not surprisingly, teenagers are drawn to love stories in which full intimacy cannot occur—here I think of current passions for films and novels about high school vampires who cannot sexually consummate relationships for fear of hurting those they love. And teenagers are drawn to the idea of technological communion. They talk easily of robots that would be safe and predictable companions.11
