On Immunity: An Inoculation (ISBN 9781555973278)

SHORTLY AFTER HE TURNED FOUR, my son slept in my arms like a heavy newborn baby while a doctor impressed on me that his allergies, which now included some food allergies, could pose a serious threat to his health. My observations, in part, had brought us to this diagnosis, but I doubted both myself and the doctor as I looked down at my son, who appeared perfectly unthreatened in his sleep. After the doctor left the room, a nurse demonstrated the EpiPen I would need to use if my son ever had a life-threatening reaction to nuts. “I know,” she said when she saw tears well up in my eyes while she pretended to jab herself forcefully in the thigh with the syringe. “I hope you never have to do this.” Later I would dutifully read all the information the doctor had given me, while still maintaining the secret belief that none of it was true and that food could not hurt my child.

In the lists upon lists of things my son was advised by the doctor to avoid, one item in particular caught my attention—the seasonal flu shot. Children with egg allergies can react to this particular vaccine, which is grown in eggs. My son had already been vaccinated against the flu, just as he had already eaten many eggs, but I could see the irony in the possibility that a vaccine posed him special danger. Thinking with the logic of a Greek myth, I wondered if my interest in immunity had somehow invited immune dysfunction for him. Maybe I had given him, like poor Icarus, fragile wings.

I did not admit this fear to the doctor, but I did ask her what I had done to cause these allergies. I hoped to reverse the damage, or at least stem it. The possibility that I was not to blame did not initially occur to me. The doctor, herself a mother, spent some time assuring me that although the origin of allergies is mysterious, there was probably nothing I could have done differently. I myself have allergies, as does my husband, so if I was to blame, she suggested, it was only for carrying the genetic material I carry. This did not satisfy me. Neither did anything I went on to learn about allergies, about which we seem to know very little.

There is a passage in Daniel Defoe’s A Journal of the Plague Year in which his narrator wonders how the disease finds its victims. He does not believe, as others do, that it is simply a “Stroke from Heaven.” He is certain that it is passed from one person to another, “Spread by Infection, that is to say, by some certain Steams, or Fumes, which the Physicians call Effluvia, by the Breath, or by the Sweat, or by the Stench of the Sores of the sick Persons, or some other way, perhaps, beyond even the Reach of the Physicians themselves …” Indeed, it would be over 150 years before physicians would know that the plague is passed by fleas.

As the plague spreads, Defoe’s narrator has an understanding that contagion is at work, and some inkling of germ theory, but he rejects that theory. The idea of “invisible Creatures, who enter into the Body with the Breath, or even at the Pores with the Air, and there generate, or emit most acute Poisons” strikes him as unlikely. He has heard that if a person with the plague breathes on a piece of glass, “there might living Creatures be seen by microscope of strange monstrous and frightful Shapes, such as Dragons, Snakes, Serpents, and Devils, horrible to behold.” But this, he writes, “I very much question the Truth of.” Faced with the plague and unable to make sense of his own observations, the narrator is left to reckon with improbable theories and pure speculation. Several hundred years later, I find his predicament eerily familiar.

Bubonic plague still exists, but it has ceased to be the Plague. The afflictions that take the most lives worldwide are now heart disease, stroke, respiratory infections, and AIDS, which is the only one of these that tends to be characterized as a plague. The number of lives a disease claims, as Susan Sontag observes, is not what makes it a plague. In order to be promoted to plague, a disease must be particularly feared or dreaded. I have lived through the emergence of a number of well-publicized diseases, but I never felt threatened by Ebola, or SARS, or West Nile virus, or H1N1. When my son was an infant, I feared autism, which seemed to be spreading like a plague, particularly among boys. And when he developed allergies, one after another, I began to feel dread. Perhaps the final qualification for what constitutes a plague is its proximity to your own life.

“Can you imagine,” I ask a friend while reading A Journal of the Plague Year, “seeing people all around you dying from a disease and not knowing what is causing it, or how it is passed, or who will be next?” Even as I say this, it occurs to me that my friend lived in San Francisco at the height of the AIDS epidemic, and saw nearly everyone he knew die of a disease about which almost nothing was known. San Francisco in 1989, he reminds me, was not entirely unlike London in 1665.

Later, perhaps because I am still reckoning with the strangeness of how both near and far the plague of London feels to my own time and place, I ask the same question again. “Can you imagine?” I ask my father. By his silence I understand that he can. My father sees the sick every day—a plague is endlessly unfolding before him. “We don’t have bodies falling out of windows,” I say to him hopefully. “We aren’t digging mass graves.”

“Yes,” he says, “but we’re seeding a bomb.” He is referring to antibiotic-resistant bacteria. The overuse of antibiotics has led to strains of bacteria that are difficult to rid from the body. One, C. difficile, is even named for its difficulty. In the case of C. difficile, over 90 percent of infections occur following a course of antibiotics. An alarming number of the patients he sees in the hospital, my father tells me, are infected with resistant bacteria.

The persistence of resistant bacteria and the emergence of novel diseases are among the top public health threats of the twenty-first century. One of these threats comes from within and is the result of our modern practices. The other comes from without and cannot be anticipated by our medicine. Both speak to our most basic fears. But novel diseases, in their capacity to serve as metaphors for foreign others and anxieties about the future, tend to generate better copy. As I write, two new diseases have been making headlines. One is an avian influenza that emerged in China, the other is a novel coronavirus that was first detected in Saudi Arabia. The latter, which is the most threatening new disease of the moment, has been given the unfortunate name Middle East respiratory syndrome.

In the past century there have been three major influenza pandemics, including the 1918 Spanish flu pandemic, which killed more people than the First World War. That pandemic proved particularly deadly for young adults with strong immune systems, as it caused an overwhelming immune response. In 2004, the director of the WHO announced that another major pandemic is inevitable. “It’s not a matter of if, but when,” a bioethicist friend tells me. With this probability in the air, novel influenza outbreaks are often accompanied by a flurry of media attention, some of which tips into fearmongering. But even when influenza is made foreign or animal with a new name, like Chinese bird flu or swine flu, we do not seem eager to imagine it as a plague. Influenza is too common to evoke our fear of the unknown. It is not exotic or remote enough to trigger our fear of alien others. It is not disfiguring enough to threaten our sense of self. It is not spread in a way that inspires moral repulsion or the threat of punishment. Influenza does not serve, in other words, as a very good metaphor for other fears—it has to be frightening simply for what it is.

The pediatrician Paul Offit mentioned to me, during an interview about his work, that he had recently seen two children hospitalized with influenza. Both had been immunized against everything on the childhood schedule except the flu, and both ended up on heart and lung machines. One lived, and the other died. “And then the next day, when someone comes into your office and says, ‘I don’t want to get that vaccine,’ you’re supposed to respect that decision?” Offit asked me. “You can respect the fear. The fear of vaccines is understandable. But you can’t respect the decision—it’s an unnecessary risk.”

The fact that the 2009 H1N1 influenza pandemic did not take more lives is sometimes cast, oddly, as a public health failure. “When all was said and done,” Dr. Bob writes, “the hype and fear around the H1N1 flu turned out to be unwarranted.” The pandemic was not as bad as it could have been, but it was not inconsequential. Somewhere between 150,000 and 575,000 people died from H1N1, over half in Southeast Asia and Africa, where public health measures were scarce. Autopsies suggest that many of the previously healthy people who died of the flu were killed by their immune response—they drowned in their own lung fluid.

The complaint that preventive measures against the flu were out of proportion to the threat strikes me as better applied to our military action in Iraq than to our response to an unpredictable virus. Vaccinating in advance of the flu, critics suggest, was a foolish preemptive strike. But preemption in war has different effects than preemption in health care—rather than generating ongoing conflict, like our preemptive strike against Iraq, preventive health care can make further health care unnecessary. Either way, prevention, of war as well as disease, is not our strong point. “The idea of preventive medicine is faintly un-American,” the Chicago Tribune noted in 1975. “It means, first, recognizing that the enemy is us.”

In 2011, studies of an H1N1 vaccine used only in Europe revealed that it caused an increased incidence of narcolepsy in Finland and Sweden. Initial reports suggest that the vaccine triggered narcolepsy in about 1 out of every 12,000 teenagers vaccinated in Finland, and about 1 out of 33,000 in Sweden. Research is ongoing and there is more to know, particularly how exactly the vaccine may have contributed to narcolepsy in that particular age group and population, but the incident has already been used to confirm existing fears that we are our own enemy. A problem with a vaccine is not evidence of the inevitable shortcomings of medicine, but evidence that we are, indeed, going to destroy ourselves.

“Apocalypse,” Sontag writes, “is now a long-running serial: not ‘Apocalypse Now’ but ‘Apocalypse From Now On.’ Apocalypse has become an event that is happening and not happening.” In this era of uncertain apocalypse, my father has taken to reading the Stoics, which is not an entirely surprising interest for an oncologist. What he is drawn to in their philosophy, he tells me, is the idea that you cannot control what happens to you, but you can control how you feel about it. Or, as Jean-Paul Sartre put it, “Freedom is what you do with what’s been done to you.”

What has been done to us seems to be, among other things, that we have been made fearful. What will we do with our fear? This strikes me as a central question of both citizenship and motherhood. As mothers, we must somehow square our power with our powerlessness. We can protect our children to some extent. But we cannot make them invulnerable any more than we can make ourselves invulnerable. “Life,” as Donna Haraway writes, “is a window of vulnerability.”

DRACULA’S FIRST VICTIM when he arrives in England is a beautiful young woman who is found weak and pale each morning but is kept alive through a series of blood transfusions. Luckily, she has three men in love with her, all eager to give their blood. One of them writes in his diary, “No man knows till he experiences it, what it is to feel his own life-blood drawn away into the veins of the woman he loves.” Dracula has a taste for good-looking women, but he does not, so far as we know, feel love. Bram Stoker’s Dracula did not become a vampire to spend immortality searching for his one true love, as Francis Ford Coppola’s adaptation suggests. He was always heartless, even in his mortal incarnation as Vlad the Impaler. Dracula, after all, is not a person so much as he is the embodiment of disease. And the vampire hunters who pursue him are not people so much as they are metaphors for the best impulses of medicine. Vampires take blood, and vampire hunters give blood.

As I wait with my arm outstretched to give blood, I consider this distinction. My son, who has taken to wearing a cape, likes to talk about bad guys and good guys, despite my insistence that most people are both. We are both vampires and vampire hunters, caped and uncaped. I think of Stephen King’s daughter, Naomi King, who once explained that while she does not care for horror as a genre she does care about theological questions of how we make friends with our monsters. “If we demonize other people,” she said, “and create monsters out of each other and act monstrous—and we all have that capacity—then how do we not become monsters ourselves?”

“Want some blood?” my son recently asked me, and held the battery terminals of a defunct smoke detector against my arm in a mock transfusion. When the operation was complete, he said proudly, “Now you don’t have to eat.” He thinks I am a vampire. And I am, in ways. I am here giving blood as an antidote to my own vampirism. I am also giving to repay the loan I was given by some other anonymous donor. I try to imagine that donor now as I look at the people in the chairs across from me—a muscular man studying flash cards and a middle-aged woman reading a novel and a man in a business suit looking at his phone. They are the same people I might see waiting for the train, but here they are bathed in an aura of altruism.

The reasons people give blood cannot be explained by personal gain, that much we know. This does not mean that nobody has anything to gain by giving blood. In quite a few countries, including the United States, it is common practice to offer “incentives” for blood donation. In 2008, the Red Cross ran a blood drive themed “Give a Little, Buy a Lot,” in which donors had a chance to win a gift card worth $1,000. Give a Little, Buy a Lot would also seem to be the theme of contemporary American life, and the spirit of our high holidays. But there is some research by economists suggesting that incentives can actually discourage blood donation. Offering incentives for giving, one study concluded, can insult people who want to give just for the sake of giving.
