The Glass Cage: Automation and Us

What cloning shears away is nuance. Nearly all the content of a typical electronic record “is boilerplate,” one internist told Hoff. “The story’s just not there. Not in my notes, not in other doctors’ notes.” The cost of diminished specificity and precision is compounded as cloned records circulate among other doctors. Physicians end up losing one of their main sources of on-the-job learning. The reading of dictated or handwritten notes from specialists has long provided an important educational benefit for primary-care doctors, deepening their understanding not only of individual patients but of everything from “disease treatments and their efficacy to new modes of diagnostic testing,” Hoff writes. As those reports come to be composed more and more of recycled text, they lose their subtlety and originality, and they become much less valuable as learning tools.[15]

Danielle Ofri, an internist at Bellevue Hospital in New York City who has written several books on the practice of medicine, sees other subtle losses in the switch from paper to electronic records. Although flipping through the pages of a traditional medical chart may seem archaic and inefficient these days, it can provide a doctor with a quick but meaningful sense of a patient’s health history, spanning many years. The more rigid way that computers present information actually tends to foreclose the long view. “In the computer,” Ofri writes, “all visits look the same from the outside, so it is impossible to tell which were thorough visits with extensive evaluation and which were only brief visits for medication refills.” Faced with the computer’s relatively inflexible interface, doctors often end up scanning a patient’s records for “only the last two or three visits; everything before that is effectively consigned to the electronic dust heap.”[16]

A recent study of the shift from paper to electronic records at University of Washington teaching hospitals provides further evidence of how the format of electronic records can make it harder for doctors to navigate a patient’s chart to find notes “of interest.” With paper records, doctors could use the “characteristic penmanship” of different specialists to quickly home in on critical information. Electronic records, with their homogenized format, erase such subtle distinctions.[17]
Beyond the navigational issues, Ofri worries that the organization of electronic records will alter the way physicians think: “The system encourages fragmented documentation, with different aspects of a patient’s condition secreted in unconnected fields, so it’s much harder to keep a global synthesis of the patient in mind.”[18]

The automation of note taking also introduces what Harvard Medical School professor Beth Lown calls a “third party” into the exam room. In an insightful 2012 paper, written with her student Dayron Rodriquez, Lown tells of how the computer itself “competes with the patient for clinicians’ attention, affects clinicians’ capacity to be fully present, and alters the nature of communication, relationships, and physicians’ sense of professional role.”[19]
Anyone who has been examined by a computer-tapping doctor probably has firsthand experience of at least some of what Lown describes, and researchers are finding empirical evidence that computers do indeed alter in meaningful ways the interactions between physician and patient. In a study conducted at a Veterans Administration clinic, patients who were examined by doctors taking electronic notes reported that “the computer adversely affected the amount of time the physician spent talking to, looking at, and examining them” and also tended to make the visit “feel less personal.”[20]
The clinic’s doctors generally agreed with the patients’ assessments. In another study, conducted at a large health maintenance organization in Israel, where the use of EMR systems is more common than in the United States, researchers found that during appointments with patients, primary-care physicians spend between 25 and 55 percent of their time looking at their computer screen. More than 90 percent of the Israeli doctors interviewed in the study said that electronic record keeping “disturbed communication with their patients.”[21]
Such a loss of focus is consistent with what psychologists have learned about how distracting it can be to operate a computer while performing some other task. “Paying attention to the computer and to the patient requires multitasking,” observes Lown, and multitasking “is the opposite of mindful presence.”[22]

The intrusiveness of the computer creates another problem that’s been widely documented. EMR and related systems are set up to provide on-screen warnings to doctors, a feature that can help avoid dangerous oversights or mistakes. If, for instance, a physician prescribes a combination of drugs that could trigger an adverse reaction in a patient, the software will highlight the risk. Most of the alerts, though, turn out to be unnecessary. They’re irrelevant, redundant, or just plain wrong. They seem to be generated not so much to protect the patient from harm as to protect the software vendor from lawsuits. (In bringing a third party into the exam room, the computer also brings in that party’s commercial and legal interests.) Studies show that primary-care physicians routinely dismiss about nine out of ten of the alerts they receive. That breeds a condition known as alert fatigue. Treating the software as an electronic boy-who-cried-wolf, doctors begin to tune out the alerts altogether. They dismiss them so quickly when they pop up that even the occasional valid warning ends up being ignored. Not only do the alerts intrude on the doctor-patient relationship; they’re served up in a way that can defeat their purpose.[23]

A medical exam or consultation involves an extraordinarily intricate and intimate form of personal communication. It requires, on the doctor’s part, both an empathic sensitivity to words and body language and a coldly rational analysis of evidence. To decipher a complicated medical problem or complaint, a clinician has to listen carefully to a patient’s story while at the same time guiding and filtering that story through established diagnostic frameworks. The key is to strike the right balance between grasping the specifics of the patient’s situation and inferring general patterns and probabilities derived from reading and experience. Checklists and other decision guides can serve as valuable aids in this process. They bring order to complicated and sometimes chaotic circumstances. But as the surgeon and New Yorker writer Atul Gawande explained in his book The Checklist Manifesto, the “virtues of regimentation” don’t negate the need for “courage, wits, and improvisation.” The best clinicians will always be distinguished by their “expert audacity.”[24]
By requiring a doctor to follow templates and prompts too slavishly, computer automation can skew the dynamics of doctor-patient relations. It can streamline patient visits and bring useful information to bear, but it can also, as Lown writes, “narrow the scope of inquiry prematurely” and even, by provoking an automation bias that gives precedence to the screen over the patient, lead to misdiagnoses. Doctors can begin to display “ ‘screen-driven’ information-gathering behaviors, scrolling and asking questions as they appear on the computer rather than following the patient’s narrative thread.”[25]

Being led by the screen rather than the patient is particularly perilous for young practitioners, Lown suggests, as it forecloses opportunities to learn the most subtle and human aspects of the art of medicine—the tacit knowledge that can’t be garnered from textbooks or software. It may also, in the long run, hinder doctors from developing the intuition that enables them to respond to emergencies and other unexpected events, when a patient’s fate can be sealed in a matter of minutes. At such moments, doctors can’t be methodical or deliberative; they can’t spend time gathering and analyzing information or working through templates. A computer is of little help. Doctors have to make near-instantaneous decisions about diagnosis and treatment. They have to act. Cognitive scientists who have studied physicians’ thought processes argue that expert clinicians don’t use conscious reasoning, or formal sets of rules, in emergencies. Drawing on their knowledge and experience, they simply “see” what’s wrong—oftentimes making a working diagnosis in a matter of seconds—and proceed to do what needs to be done. “The key cues to a patient’s condition,” explains Jerome Groopman in his book How Doctors Think, “coalesce into a pattern that the physician identifies as a specific disease or condition.” This is talent of a very high order, where, Groopman says, “thinking is inseparable from acting.”[26]
Like other forms of mental automaticity, it develops only through continuing practice with direct, immediate feedback. Put a screen between doctor and patient, and you put distance between them. You make it much harder for automaticity and intuition to develop.

It didn’t take long, after their ragtag rebellion was crushed, for the surviving Luddites to see their fears come true. The making of textiles, along with the manufacture of many other goods, went from handicraft to industry within a few short years. The sites of production moved from homes and village workshops to large factories, which, to ensure access to sufficient laborers, materials, and customers, usually had to be built in or near cities. Craft workers followed the jobs, uprooting their families in a great wave of urbanization that was swollen by the loss of farming jobs to threshers and other agricultural equipment. Inside the new factories, ever more efficient and capable machines were installed, boosting productivity but also narrowing the responsibility and autonomy of those who operated the equipment. Skilled craftwork became unskilled factory labor.

Adam Smith had recognized how the specialization of factory jobs would lead to the deskilling of workers. “The man whose whole life is spent in performing a few simple operations, of which the effects too are, perhaps, always the same, or very nearly the same, has no occasion to exert his understanding, or to exercise his invention in finding out expedients for removing difficulties which never occur,” he wrote in The Wealth of Nations. “He naturally loses, therefore, the habit of such exertion, and generally becomes as stupid and ignorant as it is possible for a human creature to become.”[27]
Smith viewed the degradation of skills as an unfortunate but unavoidable by-product of efficient factory production. In his famous example of the division of labor at a pin-manufacturing plant, the master pin-maker who once painstakingly crafted each pin is replaced by a squad of unskilled workers, each performing a narrow task: “One man draws out the wire, another straights it, a third cuts it, a fourth points it, a fifth grinds it at the top for receiving the head; to make the head requires two or three distinct operations; to put it on, is a peculiar business, to whiten the pins is another; it is even a trade by itself to put them into the paper; and the important business of making a pin is, in this manner, divided into about eighteen distinct operations.”[28]
None of the men knows how to make an entire pin, but working together, each plying his own peculiar business, they churn out far more pins than could an equal number of master craftsmen working separately. And because the workers require little talent or training, the manufacturer can draw from a large pool of potential laborers, obviating the need to pay a premium for expertise.

Smith also appreciated how the division of labor eased the way for mechanization, which served to narrow workers’ skills even further. Once a manufacturer had broken an intricate process into a series of well-defined “simple operations,” it became relatively easy to design a machine to carry out each operation. The division of labor within a factory provided a set of specifications for its machinery. By the early years of the twentieth century, the deskilling of factory workers had become an explicit goal of industry, thanks to Frederick Winslow Taylor’s philosophy of “scientific management.” Believing, in line with Smith, that “the greatest prosperity” would be achieved “only when the work of [companies] is done with the smallest combined expenditure of human effort,” Taylor counseled factory owners to prepare strict instructions for how each employee should use each machine, scripting every movement of the worker’s body and mind.[29]
The great flaw in traditional ways of working, Taylor believed, was that they granted too much initiative and leeway to individuals. Optimum efficiency could be achieved only through the standardization of work, enforced by “rules, laws, and formulae” and reflected in the very design of machines.[30]

Viewed as a system, the mechanized factory, in which worker and machine merge into a tightly controlled, perfectly productive unit, was a triumph of engineering and efficiency. For the individuals who became its cogs, it brought, as the Luddites had foreseen, a sacrifice not only of skill but of independence. The loss in autonomy was more than economic. It was existential, as Hannah Arendt would emphasize in her 1958 book The Human Condition: “Unlike the tools of workmanship, which at every given moment in the work process remain the servants of the hand, the machines demand that the laborer serve them, that he adjust the natural rhythm of his body to their mechanical movement.”[31]
Technology had progressed—if that’s the right word—from simple tools that broadened the worker’s latitude to complex machines that constrained it.
