The Glass Cage: Automation and Us

In the second half of the last century, the relation between worker and machine grew more complicated. As companies expanded, technological progress accelerated, and consumer spending exploded, employment branched out into new forms. Managerial, professional, and clerical positions proliferated, as did jobs in the service sector. Machines assumed a welter of new forms as well, and people used them in all sorts of ways, on the job and off. The Taylorist ethos of achieving efficiency through the standardization of work processes, though still exerting a strong influence on business operations, was tempered in some companies by a desire to tap workers’ ingenuity and creativity. The coglike employee was no longer the ideal. Brought into this situation, the computer quickly took on a dual role. It served a Taylorist function of monitoring, measuring, and controlling people’s work; companies found that software applications provided a powerful means for standardizing processes and preventing deviations. But in the form of the PC, the computer also became a flexible, personal tool that granted individuals greater initiative and autonomy. The computer was both enforcer and emancipator.

As the uses of automation multiplied and spread from factory to office, the strength of the connection between technological progress and the deskilling of labor became a topic of fierce debate among sociologists and economists. In 1974, the controversy came to a head when Harry Braverman, a social theorist and onetime coppersmith, published a passionate book with a dry title, Labor and Monopoly Capital: The Degradation of Work in the Twentieth Century. In reviewing recent trends in employment and workplace technology, Braverman argued that most workers were being funneled into routine jobs that offered little responsibility, little challenge, and little opportunity to gain know-how in anything important. They often acted as accessories to their machines and computers. “With the development of the capitalist mode of production,” he wrote, “the very concept of skill becomes degraded along with the degradation of labor, and the yardstick by which it is measured shrinks to such a point that today the worker is considered to possess a ‘skill’ if his or her job requires a few days’ or weeks’ training, several months of training is regarded as unusually demanding, and the job that calls for a learning period of six months or a year—such as computer programming—inspires a paroxysm of awe.”[32] The typical craft apprenticeship, he pointed out, by way of comparison, had lasted at least four years and often as many as seven. Braverman’s dense, carefully argued treatise was widely read. Its Marxist perspective fit with the radical atmosphere of the 1960s and early 1970s as neatly as a tenon in a mortise.

Braverman’s argument didn’t impress everyone.[33] Critics of his work—and there were plenty—accused him of overstating the importance of traditional craft workers, who even in the eighteenth and nineteenth centuries hadn’t accounted for all that large a proportion of the labor force. They also thought he placed too much value on the manual skills associated with blue-collar production jobs at the expense of the interpersonal and analytical skills that come to the fore in many white-collar and service posts. The latter criticism pointed to a bigger problem, one that complicates any attempt to diagnose and interpret broad shifts in skill levels across the economy. Skill is a squishy concept. Talent can take many forms, and there’s no good, objective way to measure or compare them. Is an eighteenth-century cobbler making a pair of shoes at a bench in his workshop more or less skilled than a twenty-first-century marketer using her computer to develop a promotional plan for a product? Is a plasterer more or less skilled than a hairdresser? If a pipefitter in a shipyard loses his job and, after some training, finds new work repairing computers, has he gone up or down the skill ladder? The criteria necessary to provide good answers to such questions elude us. As a result, debates about trends in deskilling, not to mention upskilling, reskilling, and other varieties of skilling, often bog down in bickering over value judgments.

But if the broad skill-shift theories of Braverman and others are fated to remain controversial, the picture becomes clearer when the focus shifts to particular trades and professions. In case after case, we’ve seen that as machines become more sophisticated, the work left to people becomes less so. Although it’s now been largely forgotten, one of the most rigorous explorations of the effect of automation on skills was completed during the 1950s by the Harvard Business School professor James Bright. He examined, in exhaustive detail, the consequences of automation on workers in thirteen different industrial settings, ranging from an engine-manufacturing plant to a bakery to a feed mill. From the case studies, he derived an elaborate hierarchy of automation. It begins with the use of simple hand tools and proceeds up through seventeen levels to the use of complex machines programmed to regulate their own operation with sensors, feedback loops, and electronic controls. Bright analyzed how various skill requirements—physical effort, mental effort, dexterity, conceptual understanding, and so on—change as machines become more fully automated. He found that skill demands increase only in the very earliest stages of automation, with the introduction of power hand tools. As more complex machines are introduced, skill demands begin to slacken, and the demands ultimately fall off sharply when workers begin to use highly automated, self-regulating machinery. “It seems,” Bright wrote in his 1958 book Automation and Management, “that the more automatic the machine, the less the operator has to do.”[34]

To illustrate how deskilling proceeds, Bright used the example of a metalworker. When the worker uses simple manual tools, such as files and shears, the main skill requirements are job knowledge, including in this case an appreciation of the qualities and uses of metal, and physical dexterity. When power hand tools are introduced, the job grows more complicated and the cost of errors is magnified. The worker is called on to display “new levels of dexterity and decision-making” as well as greater attentiveness. He becomes a “machinist.” But when hand tools are replaced by mechanisms that perform a series of operations, such as milling machines that cut and grind blocks of metal into precise three-dimensional shapes, “attention, decision-making, and machine control responsibilities are partially or largely reduced” and “the technical knowledge requirement of machine functioning and adjustment is reduced tremendously.” The machinist becomes a “machine operator.” When mechanization becomes truly automatic—when machines are programmed to control themselves—the worker “contributes little or no physical or mental effort to the production activity.” He doesn’t even require much job knowledge, as that knowledge has effectively gone into the machine through its design and coding. His job, if it still exists, is reduced to “patrolling.” The metalworker becomes “a sort of watchman, a monitor, a helper.” He might best be thought of as “a liaison man between machine and operating management.” Overall, concluded Bright, “the progressive effect of automation is first to relieve the operator of manual effort and then to relieve him of the need to apply continuous mental effort.”[35]

When Bright began his study, the prevailing assumption, among business executives, politicians, and academics alike, was that automated machinery would demand greater skills and training on the part of workers. Bright discovered, to his surprise, that the opposite was more often the case: “I was startled to find that the upgrading effect had not occurred to anywhere near the extent that is often assumed. On the contrary, there was more evidence that automation had reduced the skill requirements of the operating work force.” In a 1966 report for a U.S. government commission on automation and employment, Bright reviewed his original research and discussed the technological developments that had occurred in the succeeding years. The advance of automation, he noted, had continued apace, propelled by the rapid deployment of mainframe computers in business and industry. The early evidence suggested that the broad adoption of computers would continue rather than reverse the deskilling trend. “The lesson,” he wrote, “should be increasingly clear—it is not necessarily true that highly complex equipment requires skilled operators. The ‘skill’ can be built into the machine.”[36]

It may seem as though a factory worker operating a noisy industrial machine has little in common with a highly educated professional entering esoteric information through a touchscreen or keyboard in a quiet office. But in both cases, we see a person sharing a job with an automated system—with another party. And, as Bright’s work and subsequent studies of automation make clear, the sophistication of the system, whether it operates mechanically or digitally, determines how roles and responsibilities are divided and, in turn, the set of skills each party is called upon to exercise. As more skills are built into the machine, it assumes more control over the work, and the worker’s opportunity to engage in and develop deeper talents, such as those involved in interpretation and judgment, dwindles. When automation reaches its highest level, when it takes command of the job, the worker, skillwise, has nowhere to go but down. The immediate product of the joint machine-human labor, it’s important to emphasize, may be superior, according to measures of efficiency and even quality, but the human party’s responsibility and agency are nonetheless curtailed. “What if the cost of machines that think is people who don’t?” asked George Dyson, the technology historian, in 2008.[37] It’s a question that gains salience as we continue to shift responsibility for analysis and decision making to our computers.

The expanding ability of decision-support systems to guide doctors’ thoughts, and to take control of certain aspects of medical decision making, reflects recent and dramatic gains in computing. When doctors make diagnoses, they draw on their knowledge of a large body of specialized information, learned through years of rigorous education and apprenticeship as well as the ongoing study of medical journals and other relevant literature. Until recently, it was difficult, if not impossible, for computers to replicate such deep, specialized, and often tacit knowledge. But inexorable advances in processing speed, precipitous declines in data-storage and networking costs, and breakthroughs in artificial-intelligence methods such as natural language processing and pattern recognition have changed the equation. Computers have become much more adept at reviewing and interpreting vast amounts of text and other information. By spotting correlations in the data—traits or phenomena that tend to be found together or to occur simultaneously or sequentially—computers are often able to make accurate predictions, calculating, say, the probability that a patient displaying a set of symptoms has or will develop a particular disease or the odds that a patient with a certain disease will respond well to a particular drug or other treatment regimen.
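The correlation-spotting described above can be illustrated with a minimal sketch. The symptom and disease records below are invented for illustration; the idea is simply that a probability like “chance of flu given fever” falls out of counting how often the two appear together:

```python
from collections import Counter

# Invented toy records: (set of observed symptoms, diagnosed disease)
records = [
    ({"fever", "cough"}, "flu"),
    ({"fever", "cough", "fatigue"}, "flu"),
    ({"cough"}, "cold"),
    ({"sneezing", "cough"}, "cold"),
    ({"fever", "rash"}, "measles"),
]

def p_disease_given_symptom(symptom, disease):
    """Estimate P(disease | symptom) from co-occurrence counts."""
    with_symptom = [d for s, d in records if symptom in s]
    if not with_symptom:
        return 0.0
    return Counter(with_symptom)[disease] / len(with_symptom)

# Fever appears in three records, two of which are flu: estimate is 2/3.
print(p_disease_given_symptom("fever", "flu"))
```

Real diagnostic systems condition on many symptoms and test results at once rather than a single one, but the principle, probabilities estimated from observed co-occurrence, is the same.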

Through machine-learning techniques like decision trees and neural networks, which dynamically model complex statistical relationships among phenomena, computers are also able to refine the way they make predictions as they process more data and receive feedback about the accuracy of earlier guesses.[38] The weightings they give different variables get more precise, and their calculations of probability better reflect what happens in the real world. Today’s computers get smarter as they gain experience, just as people do. New “neuromorphic” microchips, which have machine-learning protocols hardwired into their circuitry, will boost computers’ learning ability in coming years, some computer scientists believe. Machines will become more discerning. We may bristle at the idea that computers are “smart” or “intelligent,” but the fact is that while they may lack the understanding, empathy, and insight of doctors, computers are able to replicate many of the judgments of doctors through the statistical analysis of large amounts of digital information—what’s come to be known as “big data.” Many of the old debates about the meaning of intelligence are being rendered moot by the brute number-crunching force of today’s data-processing machines.
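The feedback loop described here, in which the weightings given to different variables grow more precise as guesses are compared against outcomes, can be sketched with a tiny logistic-regression-style learner. The feature vectors and labels are invented; each error between a prediction and the known outcome nudges the weights:

```python
import math

# Invented training data: feature vectors (e.g. two test results) and outcomes (0/1)
examples = [([1.0, 0.0], 1), ([1.0, 1.0], 1), ([0.0, 1.0], 0), ([0.0, 0.0], 0)]
weights = [0.0, 0.0]
bias = 0.0
rate = 0.5  # how strongly each error adjusts the weightings

def predict(x):
    """Probability estimate via the logistic function."""
    z = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-z))

# Feedback loop: repeatedly compare guesses to known outcomes and
# adjust each variable's weighting in proportion to the error.
for _ in range(200):
    for x, y in examples:
        error = y - predict(x)
        for i, xi in enumerate(x):
            weights[i] += rate * error * xi
        bias += rate * error

print(predict([1.0, 0.0]))  # close to 1 after training on the feedback
```

After training, the first feature carries a large positive weight and the second a negative one, because only the first predicts the outcome in this toy data; that is the “weightings get more precise” effect in miniature.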

The diagnostic skills of computers will only get better. As more data about individual patients are collected and stored, in the form of electronic records, digitized images and test results, pharmacy transactions, and, in the not-too-distant future, readings from personal biological sensors and health-monitoring apps, computers will become more proficient at finding correlations and calculating probabilities at ever finer levels of detail. Templates and guidelines will become more comprehensive and elaborate. Given the current stress on achieving greater efficiency in health care, we’re likely to see the Taylorist ethos of optimization and standardization take hold throughout the medical field. The already strong trend toward replacing personal clinical judgment with the statistical outputs of so-called evidence-based medicine will gain momentum. Doctors will face increasing pressure, if not outright managerial fiat, to cede more control over diagnoses and treatment decisions to software.
