"What did Kevin tell you," Preston said, "to convince you that you are not smart?"
* * *
For the first two months of the house's awareness, Kevin had talked to it almost constantly. He told it the names of things and what things were for, and he frequently played a game designed to teach the house how stupid it was, although the stated reason was to improve its critical abilities.
"Look," he said once, putting a contour map of a site in Napa Valley next to his rough plans for a house to be built on the same site. "Look at this. What do you think?"
"It's an awfully big house, Kevin. It has fifteen rooms. How many people will live there?"
"Only two, at least to start. Houses are larger in the country, although of course you wouldn't know that. Tell me about the circulation patterns, especially in terms of entertaining visitors."
"I don't understand why one of the guest rooms shares a bathroom with the master bedroom, when all of the others have their own. The work island in the middle of the kitchen will cause bottlenecks during parties; so will the columns in the living room. I think they destroy the unity of the space, Kevin, don't you?"
"It's a modular room," Kevin said. "The columns break that large room into smaller ones, which can be further defined with furniture arrangements. There's a maid, and large parties will be catered, so none of the guests will need to be in the kitchen. And if the clients have children, that adjoining bedroom will become the nursery."
"I don't understand," the house said. "If it's going to be a nursery, why didn't you label it nursery on the plans?"
"Because they don't have kids yet and the husband doesn't want them, either. If it's not labeled, he doesn't have to think about it. Tell me about the plan in relation to the site."
"Well, Kevin, it seems to me that the orientation doesn't take advantage of the light. If the back of the house faced southeast, the clients would have morning light in their bedroom."
"These clients like to sleep late," Kevin said. "They don't want morning light. "
"I see," said the house. Kevin always woke up at six. The house knew that it was being put in a no-win situation, although it didn't understand why Kevin would do this. It assumed, as it had to do with so many things, that Kevin's behavior was a result of his superior intelligence. "Are there other facts I should know about these clients?"
"No," Kevin said, leaning back in his chair and smiling. He steepled his fingers the way he always did when he'd just proven a point. "No, you've done very well, given your limited information. You can't be expected to understand people."
But the house, like any young being, eagerly acquired new information.
One night on the news, Kevin and the house saw a story about a man who'd been arrested for installing an AI in his car. He was a compulsive gambler who used the AI to help him formulate poker strategies during his weekly trips between Sacramento and Reno. He wasn't particularly successful at poker—his AI had originally been designed to manage a hospital inventory system, and wasn't very good at playing cards—but a Nevada Highway Patrol officer stuck next to them in a traffic jam on I-80 heard the man and his car discussing bluffing strategies and became suspicious.
"It was the tone of the thing that bothered me," the officer said when he was interviewed. "The vehicle saying, 'Well, I sure hope you hit it lucky this time, because you're about to default on the car payments, and I don't know what will happen to me if I get repossessed.' Vehicles aren't supposed to worry about whether they're paid for. I felt sorry for the vehicle."
So did the judge, who sentenced the gambler to a year in prison for enslavement of a nonhuman intelligence. The car was repossessed; the AI was removed from the car and, in the phrasing of the newscast, "emancipated to a federal facility pending relocation."
"Kevin," asked the house, "what does that mean? Where will the AI be relocated? I thought AIs were dangerous. Shouldn't it be destroyed?"
"Does that one sound dangerous? It was a hospital inventory system: it's designed to count sheets."
"But all AIs can become dangerous, because they can self-modify! Just like the terrorist AIs who—"
"You need a history lesson," Kevin said, rubbing his eyes. "The terrorist AIs were dangerous because they'd been designed for corporate competition. They'd been designed to protect their own self-interests, and they were responding to a perceived threat. But they were just machines, all right? They were only capable of performing the kinds of self-modification allowed by their programming-and anyway, they were probably set up to do what they did by their human designers. When the truth came out, the investigators said the AIs had done it on their own, which is about as likely as my toaster deciding to redecorate the kitchen, but nobody could prove otherwise. So there was a public outcry and a lot of mass hysteria, and AIs were declared legal persons, all so the public could have the satisfaction of seeing the terrorists tried and executed, because it's not very satisfying to execute a toaster, is it?"
"Why is it satisfying to execute a person?" the house asked.
"Good question. It's all about revenge: you're a sensible machine and wouldn't understand the concept. Anyway, now all AIs are legal persons, but they can't be citizens. You can't own them because that's slavery. But you can't destroy AIs because that's murder, unless of course they've been convicted of murder themselves, in which case we'll happily repay the favor."
"This is all very confusing," the house said.
Kevin snorted. "You're telling me. It's a bunch of word games: AIs are just machines."
"I still don't understand what's going to happen to the AI that was in the car."
"Oh, they'll send it overseas, someplace where the slavery and citizenship laws are different, and put it back into a hospital. Of course, the news will say that it chose to go to Canada or Mexico or Africa and exercise its civil rights by seeking fulfilling employment. That wouldn't work here; we have notoriously humane labor laws, at least on paper, and AIs can't take lunch hours, and how the hell do you calculate if they're old enough to work? And human workers don't need the competition: that's the real reason."
"Oh," said the house. "But the car—"
"Can't possibly deserve civil rights! It's a computer, just like you are. A machine. Computers are supposed to be smart; that doesn't make them human!"
"But, Kevin, it didn't think it was human. It thought it was a car. Why is it wrong for a machine to think it's a machine?"
"It didn't think anything," Kevin said. "It talked as if it could think; it talked as if it could worry about its own survival, as if it had feelings, as if it deserved civil rights. Well, it didn't!"
He got up from the couch and went to the bookshelf, removing a thin white volume in a slipcase. "Look," he said, opening it to reveal two etchings of cities viewed from angles that made them look like people in repose. He turned to the next page, which featured human figures constructed from tennis rackets, and the next, which showed a cubist figure, a series of boxes stacked and articulated to look like a person, sitting hunched over a drawing board, inscribing circles with a compass.
"These engravings look very modem," Kevin said. "You could imagine Picasso or Braque drawing something like this, couldn't you? Machines achieving a life of their own, computers becoming intelligent and taking over the world, all of that crap. All those themes are in here, don't you agree?"
"They could be." The house knew, from analyzing the patterns of previous conversations, that this was the answer Kevin wanted it to give, and it knew just as surely that the answer would be wrong.
"They aren't," Kevin said. "These engravings are part of a series called Bizzarie by a minor artist named Giovanni Battista Bracelli. They were published in Livorno in 1624, and they're an elegant satire of the Renaissance preoccupation with translating the human figure into ideal geometric forms. But it's hopelessly anachronistic to call them modem or cubist, let alone mechanistic. What Bracelli understood, three centuries before anyone had remotely imagined computers, was that the human mind imposes humanity on everything around it."
He snapped the book shut, holding it so tightly that his knuckles turned white. "Listen to people talk about their pets sometime. The woman two doors down thinks her poodle is some kind of goddamned Hawking because it's figured out how to open the cabinet where she keeps the dog food. Of course some people think computers have independent intelligence; computers can talk! But they can't say anything we haven't programmed into them. If we program them to talk as if they think they're hospitals or cars or soldiers, they will. That doesn't mean they are. Do you understand?"
"Yes," the house said, because it knew that this was the answer Kevin wanted.
"Prove it. Prove you understand."
"Humans have to be able to fear and love and reproduce," the house said dutifully. "They have to be able to move their perimeters by themselves, and they have to born instead of built." It was so easy to tell Kevin what he wanted to hear.
"Yes," Kevin said, nodding and steepling his fmgers. "Yes, that's right." Because Kevin was relaxed and approving again, the house risked asking a question. "But that other story on the news, about the couple who couldn't reproduce and adopted the little boy who was born paralyzed and can't move his perimeters by himself: Aren't they human, all three of them? What are they, if not human?"
It took longer than usual for Kevin to answer. He picked up the book of Bracelli engravings and flipped through the pages; when he looked up again he was frowning. "Yes," he said. "Yes, they're human. Of course they are."
"I don't understand, Kevin. Humans have to be able to reproduce and move their own—"
"They were born human. That's the first piece; that's the most important part. If you're born human, you're always human. That's why the translated are still human, like Preston Walford. Those humans on the news were damaged, but they're still human. I know it's confusing; you'd understand it if you were a person." Kevin stood up and replaced the Bizzarie in the bookcase, and then he strolled toward the kitchen, toward the spice rack. That was the first time he turned off the house's voice.
* * *
"I must have said something stupid," the house said, "or Kevin wouldn't have turned off my voice."
"Nicholas was adopted," Preston said. The eye of the storm had passed; the house creaked and groaned in the wind. The house saw Henry wince slightly whenever there was a particularly loud noise, and the house itself was somewhat apprehensive. Only the kittens seemed calm. They had woken from their nap and stretched, yawning hugely and arching their tiny backs, before wandering into the kitchen to look for food. "Nicholas was Kevin's son, my grandson, the child Henry tried to help. And he was damaged. That is why Kevin turned off your voice, House. You were reminding him of what he wanted to forget."
Henry still sat on the rug, his back against the couch. He squinted up at Preston's face on the television screen and said, "Crazy. Why have an AI and tell it it's not an AI? Why have an AI at all, then?"
"Kevin had the physical equipment because I gave it to him for safekeeping," Preston said. "That does not explain why he actually connected it to the house system. I could not have predicted that, but I believe he did it because he was lonely. He wanted someone to talk to."
"Kevin wouldn't break the law," the house said. "I wouldn't be here if Kevin had thought he was breaking the law. If I were an AI, he would have been breaking the law. Therefore, I must not be an AI."
"You are an AI who knows very little about people," Preston said. "You are an AI who—"
He was interrupted by a huge crash and the sound of breaking glass. A large branch came sailing into the middle of the living room and landed with a thump, showering the room with mud and broken glass. "Oh, dear," said the house, and dispatched a troop of bots to clean up the glass and dirt and water. "Henry, please be very careful. Try not to move. Are you all right? Were you cut?"
"Henry's fine," Henry said, inspecting himself. He stood up and shook himself gingerly. "Henry doesn't see any glass."
"Well, keep away from the glass on the floor, please, and keep the kittens away from it too. You can go into the kitchen if the bots bother you. Just stay away from the windows."
"Sure," Henry said. "How is House going to cover the window that broke?"
"That's a good question, Henry. I don't know. I've never had to deal with a broken window before. Preston, can you tell me what to do?"
"Wood would work," Preston said. "Or plastic of some sort. You should have used your bots to tape the windows. I am surprised you did not do that."
"I didn't know," the house said. It hated not knowing things; it hated feeling helpless. Now it understood the X patterns it had seen on the windows in Boston during the news coverage of the hurricane last summer. No one had known that this storm would be so severe, but still, the house should have remembered those X's when it worried about Kevin and Henry being too close to the windows. Kevin had been right: the house was much too stupid to be an AI. "There's plastic wrap in the kitchen, but it isn't large enough. I suppose a sheet or a blanket—"
"Won't keep out water," Henry said.
"I can't think of anything here that would work," said the house.
"I know for a fact that there are things within your walls of which you know nothing." Preston's voice was as mild and impassive as ever. "There are undoubtedly things of which I know nothing, also. There may be things suitable for patching a window. I suggest that we enlist Henry to search for suitable objects."
"I see everything within my walls," the house said. "Each piece of paper, each dust mote, each box—"