Author: Alex Boese
Every year proud parents collectively shoot hundreds of thousands—perhaps millions—of hours of baby movies. But no one shoots more than Deb Roy.
As of January 2006, when his kid was six months old, Roy had amassed 24,000 hours of baby video. By the time his child is three, Roy hopes to have 400,000 hours of audio and video, enough to make any dinner guest shudder.
Roy has managed this feat by installing eleven overhead omnidirectional megapixel fish-eye video cameras in the rooms of his home, as well as fourteen microphones. Literally every move and utterance his baby makes is recorded. The data, almost three hundred gigabytes’ worth of it a day, is continuously streamed to a five-terabyte disk cache in his basement. Since installing the system, he’s seen his electricity bill quadruple.
Roy isn’t doing this for the sake of parental pride, though that certainly plays a part. He is head of the MIT Media Lab’s Cognitive Machines Research Group. When his wife became pregnant he realized he had the perfect opportunity to study how children learn language. Roy plans on recording almost everything his son hears and sees from birth until the age of three (in mid-2008). Powerful computers at the MIT Media Lab will then analyze the footage, searching for the visual and verbal clues the child has used to construct his understanding of language. A one-million-gigabyte storage system—one of the largest storage arrays in the world—has been built to hold all the data. The MIT engineers will then attempt to build a model of language acquisition out of the data. With luck, this model can be used to design a machine-learning system that can mimic a human baby’s ability to learn language.
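For the numerically curious, here is a rough back-of-the-envelope sketch, in Python, of how the figures quoted above fit together. The 365-day year and the decimal gigabyte-to-terabyte conversion are assumptions of the sketch, not numbers Roy reports.

```python
# Back-of-the-envelope check of the Speechome storage figures quoted above.
# Assumptions (not from the text): 365-day years, decimal units (1 TB = 1,000 GB).

DAILY_DATA_GB = 300            # "almost three hundred gigabytes ... a day"
BASEMENT_CACHE_TB = 5          # the five-terabyte disk cache in the basement
ARRAY_CAPACITY_GB = 1_000_000  # the "one-million-gigabyte" storage system
RECORDING_YEARS = 3            # birth until the age of three

total_gb = DAILY_DATA_GB * 365 * RECORDING_YEARS
print(f"Total over {RECORDING_YEARS} years: ~{total_gb:,} GB (~{total_gb / 1000:.0f} TB)")
print(f"Fraction of the MIT array used: ~{total_gb / ARRAY_CAPACITY_GB:.0%}")
print(f"Basement cache holds ~{BASEMENT_CACHE_TB * 1000 / DAILY_DATA_GB:.0f} days of recording")
```

The totals come out to just under 330 terabytes over three years, about a third of the million-gigabyte array, while the basement cache holds a little over two weeks of recording at a time.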
Roy calls his experiment the Human Speechome Project. Speechome stands for “Speech at home.” The media, however, have dubbed it the Baby Brother Project. But, unlike the Big Brother contestants, Roy hasn’t totally sacrificed his privacy. The cameras in every room have an ON/OFF switch, as well as an OOPS button that deletes the last few minutes of activity. Roy notes that the OOPS button was used 109 times during the first six months the system was in place, although he doesn’t state why. If it was used because, “Oops, I just said a bad word,” that omission could undermine the purpose of the project. MIT analysts will be scratching their heads wondering, “How in the world did he learn to say that?” Of course, Roy can always do what millions of parents do—blame it on the TV.
Back in the Middle Ages, toilets—the few that existed—were placed at the top of castle turrets. Waste products slid down a chute into the moat below. This was one reason swimming the moat was an unappealing prospect. The anonymous author of The Life of St. Gregory praised the solitude of this lofty perch for the “uninterrupted reading” it allowed. This admission made him one of the first bathroom readers in recorded history. Today many people do much of their best reading while on the loo. It is possible you are reading these very words in such a situation. Toilet readers can be divided into two types: relaxers and workers. For relaxers the toilet is a place of retreat and tranquillity. Dr. Harold Aaron, author of Our Common Ailment, Constipation, notes, “Reading gives the initial feeling of relaxation so useful for proper performance.” Workers, on the other hand, dare not waste even those few minutes spent attending to bodily functions. Lord Chesterfield tells of “a gentleman who was so good a manager of his time that he would not even lose that small portion of it which the call of nature obliged him to pass in the necessary-house; but gradually went through all the Latin poets, in those moments.” This chapter is dedicated to toilet readers of all persuasions. Gathered in the following pages are unusual experiments that speak, in some way, to loo-related themes. May they help set the mood for peak performance.
The yellow-fever patient groaned as he lay in bed. His skin had a sickly lemon tinge, marred by red and brown spots. The smell of decay clung to him. Suddenly he jerked upward and leaned over the side of the bed. Black vomit, like thick coffee grounds, gushed from his mouth. A young doctor sitting by his side expertly caught the spew in a bucket and patted the patient on the back. “Get it all out,” he said. A few final mucus-laced black globs dribbled from the patient’s mouth before the man collapsed onto the bed. The doctor swirled the steaming liquid in the bucket a few times, examining it closely. The stench of it was overpowering, but barely a flicker of disapproval registered on the doctor’s face. Instead he calmly poured the vomit into a cup, lifted it to his lips, and slowly and deliberately drank it down.
The vomit-imbibing doctor was Stubbins Ffirth. As successive yellow-fever epidemics devastated the population of Philadelphia during the early nineteenth century, Ffirth made a name for himself by courageously exposing himself to the disease to prove his firm belief that yellow fever was noncontagious.
Ffirth confessed that when he first saw the ravages of yellow fever he, like everyone else, believed it to be contagious. But subsequent observation disabused him of this belief. The disease ran riot during the sweltering summer months, but disappeared as winter approached. Why, he wondered, would weather affect a contagious disease? And why didn’t he grow sick, despite his constant contact with patients? He concluded yellow fever was actually “a disease of increased excitement” brought on by an excess of stimulants such as heat, food, and noise. If only people would calm down, he theorized, they would not develop the disease.
To prove his noncontagion hypothesis, Ffirth devised a series of tests. First he confined a dog to a room and fed it bread soaked in the characteristic black vomit regurgitated by yellow-fever victims. (The blackness is caused by blood hemorrhaging from the gastrointestinal tract.) The animal did not grow sick. In fact, “at the expiration of three days he became so fond of it, that he would eat the ejected matter without bread.” Pet-food manufacturers might want to take note.
Emboldened by this success, Ffirth moved on to a human subject, himself:
On the 4th of October, 1802, I made an incision in my left arm, mid way between the elbow and wrist, so as to draw a few drops of blood; into the incision I introduced some fresh black vomit; a slight degree of inflammation ensued, which entirely subsided in three days, and the wound healed up very readily.
Ffirth’s experiments grew progressively bolder. He made deeper incisions in his arms, into which he poured black vomit. He dribbled the stuff into his eyes. He cooked some on a skillet and forced himself to inhale the fumes. He filled a room with heated regurgitation vapors—a vomit sauna—and remained there for two hours, breathing in the air. He experienced a “great pain in my head, some nausea, and perspired very freely,” but otherwise was okay.
He then began ingesting the vomit. He fashioned some of the black matter into pills and swallowed them. Next, he mixed half an ounce of fresh vomit with water and drank it. “The taste was very slightly acid,” he wrote. “It is probable that if I had not, previous to the two last experiments, accustomed myself to tasting and smelling it, that emesis would have been the consequence.” Finally, he gathered his courage and quaffed pure, undiluted black vomit. Having apparently acquired a taste for the stuff, he even included in his treatise a recipe for black-vomit liqueur:
If black vomit be strained through a rag, and the fluid thus obtained be put in a bottle or vial, leaving about one-third part of it empty, this being corked and sealed, if set by for one or two years, will assume a pale red colour, and taste as though it contained a portion of alkahol.
Despite his Herculean efforts to infect himself, Ffirth had still not contracted yellow fever. He momentarily considered declaring his point proven, but more yellow-fever-tainted fluids remained to be tested: blood, saliva, perspiration, and urine. So he soldiered on, liberally rubbing all of these substances into incisions in his arms. The urine produced the greatest reaction, causing “some degree of inflammation.” But even this soon subsided. And he was still disease free.
Ffirth now felt justified in declaring his hypothesis proven. Yellow fever had to be noncontagious. Unfortunately, he was wrong. We now know that yellow fever is caused by a tiny RNA virus spread by mosquitoes. This explains why Ffirth observed seasonal variations in the spread of the disease. The epidemic retreated in winter as the mosquito population lessened.
How Ffirth failed to contract the disease is a bit of a mystery, considering he was rubbing infected blood into wounds on his arms. Christian Sandrock, a professor at UC Davis and an expert on infectious diseases, speculates that he simply got lucky. Yellow fever, much like other mosquito-borne diseases such as West Nile virus, requires direct transmission into the bloodstream to cause infection. So Ffirth happened to pick the right virus to smear all over himself. Had he done the same thing with smallpox, he would have been a goner.
Although Ffirth made a bad guess about the cause of the disease, his experiments weren’t entirely in vain. He submitted his research to the University of Pennsylvania to satisfy the requirements for the degree of Doctor of Medicine, which was subsequently granted to him. Modern graduate students who complain about the excessive demands of their thesis committees might want to keep his example in mind. They don’t realize how easy they have it.
Imagine a dog turd. Some unknown pooch deposited it on a lawn weeks ago. Since then it’s been baking in the sun until it’s formed hard, crusty ridges. Would you want to pick this up and eat it? Of course not.
Now imagine it’s 1986 and you’re an undergraduate at the University of Pennsylvania. You volunteered to participate in a food-preferences study, and you find yourself sitting in a small, square laboratory room. You’ve just been given a piece of fudge to eat, and it was very good. The researcher now presents you with two more pieces of fudge, but there’s obviously a trick. The fudge morsel on the left is molded into the form of a disk, but the one on the right is in the shape of a “surprisingly realistic piece of dog feces.”
“Please indicate which piece you would prefer,” the researcher asks in a serious tone.
This question was posed to people as part of a study designed by Paul Rozin, an expert in the psychology of disgust. The responses his team got were no surprise. Participants overwhelmingly preferred the disk-shaped fudge, rating it almost fifty points higher on a two-hundred-point preference scale.
The researchers exposed subjects to a variety of gross-out choices. In each situation the options were equally hygienic—there was never any risk of bacterial infection—but one was always distinctly more stomach turning than the other.
They offered volunteers a choice between a glass of apple juice into which a candleholder had been dipped and one in which a dried, sterilized cockroach had been dunked. “Which would you prefer?” the researcher asked as he dropped the cockroach into the glass and stirred it around. The roach juice scored one hundred points lower on the preference scale.
Would volunteers prefer to hold a clean rubber sink stopper between their teeth, or a piece of rubber imitation vomit? The sink stopper won out.
Would they be willing to eat a bowl of fresh soup stirred by an unused fly swatter, or poured into a brand-new bedpan? “No, thank you,” participants responded in both cases.
The researchers offered the results of their study as evidence of the “laws of sympathetic magic” at work in American culture. These laws were named and first described by the nineteenth-century anthropologist Sir James Frazer in his classic work The Golden Bough, an encyclopedic survey of the belief systems of “primitive” cultures around the world.
Frazer noticed two forms of belief showing up again and again. First, there was the law of contagion: “Once in contact, always in contact.” If an offensive object touches a neutral object, such as a cockroach touching some juice, the neutral object becomes tainted by the contact. Second, there was the law of similarity: “The image equals the object.” Two things that look alike are thought to share the same properties. A voodoo doll that looks like a person becomes equivalent to that person. Or fudge that looks like feces is as disgusting as feces.
In the modern world we like to think of ourselves as being quite rational. We understand the principles of good hygiene, and how diseases are transmitted. We like to imagine we’re not ruled by simple superstitions. And yet, of course, we are. As the researchers put it, “We have called attention to some patterns of thinking that have generally been considered to be restricted to preliterate or Third World cultures.” The curious thing is that most of us will readily acknowledge the illogic of these beliefs. We know the laws of sympathetic magic aren’t real. But we’re still not about to eat that doggy-doo fudge.
A man sits in a stall at a public lavatory on a college campus. He has been there for well over an hour. At his feet is a stack of books, and hidden among these books is a small periscope he is using to peer beneath the door and watch a man standing at a urinal, going to the bathroom. Through his periscope the stall-sitter can see the stream of urine trickling down the porcelain into the drain. The urine stops, and he immediately presses a button on a stopwatch he holds in his hand.
The man in the stall was not a Peeping Tom. He was actually a reputable researcher conducting a scientific experiment. At least that was his story, and he was sticking to it.
The location of this peculiar scene was “a men’s lavatory at a midwestern U.S. university.” The date was sometime in the mid-1970s. Let’s momentarily follow a hypothetical person, probably a student at the university, who wandered into the lavatory and unwittingly became a participant in the stall-sitter’s experiment. We’ll call him Joe.