The Glass Cage: Automation and Us

When the FAA, in its 2013 safety alert for operators, suggested that airlines encourage pilots to assume manual control of their planes more frequently during flights, it was also taking a stand, if a tentative one, in favor of human-centered automation. Keeping the pilot more firmly in the loop, the agency had come to realize, could reduce the chances of human error, temper the consequences of automation failure, and make air travel even safer than it already is. More automation is not always the wisest choice. The FAA, which employs a large and respected group of human-factors researchers, is also paying close attention to ergonomics as it plans its ambitious “NextGen” overhaul of the nation’s air-traffic-control system. One of the project’s overarching goals is to “create aerospace systems that adapt to, compensate for, and augment the performance of the human.”[34]

In the financial industry, the Royal Bank of Canada is also going against the grain of technology-centered automation. At its Wall Street trading desk, it has installed a proprietary software program, called THOR, that actually slows down the transmission of buy and sell orders in a way that protects them from the algorithmic manipulations of high-speed traders. By slowing the orders, RBC has found, trades often end up being executed at more attractive terms for its customers. The bank admits that it’s making a trade-off in resisting the prevailing technological imperative of speedy data flows. By eschewing high-speed trading, it makes a little less money on each trade. But it believes that, over the long run, the strengthening of client loyalty and the reduction of risk will lead to higher profits overall.[35]
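
The paragraph above describes THOR only as slowing orders down; the widely reported idea behind that slowing is arrival-time equalization: when a large order is split among several exchanges, the faster routes are deliberately held back so that every child order lands at the same instant, leaving fast traders no window to react to the first fill and reprice the rest. The sketch below illustrates that scheduling logic with hypothetical venues and latencies; it is not RBC’s actual implementation.

```python
# Arrival-time equalization for routed orders (illustrative only).
# Venue names and one-way latencies (microseconds) are made up.
venue_latency_us = {"NYSE": 480, "NASDAQ": 350, "BATS": 210, "EDGX": 190}

def release_delays(latencies):
    """Hold the faster routes so every child order arrives at the same time."""
    slowest = max(latencies.values())
    return {venue: slowest - lat for venue, lat in latencies.items()}

for venue, delay in release_delays(venue_latency_us).items():
    arrival = venue_latency_us[venue] + delay
    print(f"{venue:7s} hold {delay:3d} us -> arrives at t+{arrival} us")
```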

One former RBC executive, Brad Katsuyama, is going even further. Having watched stock markets become skewed in favor of high-frequency traders, he spearheaded the creation of a new and fairer exchange, called IEX. Opened late in 2013, IEX imposes controls on automated systems. Its software manages the flow of data to ensure that all members of the exchange receive pricing and other information at the same time, neutralizing the advantages enjoyed by predatory trading firms that situate their computers next door to exchanges. And IEX forbids certain kinds of trades and fee schemes that give an edge to speedy algorithms. Katsuyama and his colleagues are using sophisticated technology to level the playing field between people and computers. Some national regulatory agencies are also trying to put the brakes on automated trading, through laws and regulations. In 2012, France placed a small tax on stock trades, and Italy followed suit a year later. Because high-frequency-trading algorithms are usually designed to execute volume-based arbitrage strategies—each trade returns only a minuscule profit, but millions of trades are made in a matter of moments—even a tiny transaction tax can render the programs much less attractive.
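
To see why even a tiny tax bites, it helps to run the numbers. The figures below are hypothetical, chosen only to show the scale: when the edge captured on each trade is a fraction of a basis point, a tax of a hundredth of a percent on each trade’s notional value can exceed the entire profit margin.

```python
# Back-of-the-envelope arithmetic (hypothetical figures) showing how a tiny
# transaction tax can swamp a volume-based arbitrage strategy.
trades_per_day = 1_000_000        # "millions of trades in a matter of moments"
notional_per_trade = 10_000.0     # dollars of stock moved per trade
edge_per_trade = 0.00005          # half a basis point of profit per trade
tax_rate = 0.0001                 # a 0.01 percent tax on each trade's value

gross_profit = trades_per_day * notional_per_trade * edge_per_trade
tax_paid = trades_per_day * notional_per_trade * tax_rate

print(f"gross profit: ${gross_profit:,.0f}")
print(f"tax paid:     ${tax_paid:,.0f}")
print(f"net:          ${gross_profit - tax_paid:,.0f}")  # negative: the strategy no longer pays
```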

Such attempts to rein in automation are encouraging. They show that at least some businesses and government agencies are willing to question the prevailing technology-first attitude. But these efforts remain exceptions to the rule, and their continued success is far from assured. Once technology-centered automation has taken hold in a field, it becomes very hard to alter the course of progress. The software comes to shape how work is done, how operations are organized, what consumers expect, and how profits are made. It becomes an economic and a social fixture. This process is an example of what the historian Thomas Hughes calls “technological momentum.”[36] In its early development, a new technology is malleable; its form and use can be shaped not only by the desires of its designers but also by the concerns of those who use it and the interests of society as a whole. But once the technology becomes embedded in physical infrastructure, commercial and economic arrangements, and personal and political norms and expectations, changing it becomes enormously difficult. The technology is at that point an integral component of the social status quo. Having amassed great inertial force, it continues down the path it’s on. Particular technological components will still become outdated, of course, but they’ll tend to be replaced by new ones that refine and perpetuate the existing modes of operation and the related measures of performance and success.

The commercial aviation system, for example, now depends on the precision of computer control. Computers are better than pilots at plotting the most fuel-efficient routes, and computer-controlled planes can fly closer together than can planes operated by people. There’s a fundamental tension between the desire to enhance pilots’ manual flying skills and the pursuit of ever higher levels of automation in the skies. Airlines are unlikely to sacrifice profits and regulators are unlikely to curtail the capacity of the aviation system in order to give pilots significantly more time to practice flying by hand. The rare automation-related disaster, however horrifying, may be accepted as a cost of an efficient and profitable transport system. In health care, insurers and hospital companies, not to mention politicians, look to automation as a quick fix to lower costs and boost productivity. They’ll almost certainly keep ratcheting up the pressure on providers to automate medical practices and procedures in order to save money, even if doctors have worries about the long-term erosion of their most subtle and valuable talents. On financial exchanges, computers can execute a trade in ten microseconds—that’s ten millionths of a second—but it takes the human brain nearly a quarter of a second to respond to an event or other stimulus. A computer can process tens of thousands of trades in the blink of a trader’s eye.[37] The speed of the computer has taken the person out of the picture.
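
The arithmetic behind that comparison is simple but worth making explicit; the snippet below just restates the numbers in the paragraph.

```python
# How many ten-microsecond trades fit inside one human reaction time?
trade_time_s = 10e-6       # ten microseconds per trade
reaction_time_s = 0.25     # roughly a quarter of a second
print(f"{reaction_time_s / trade_time_s:,.0f} trades per human reaction")  # 25,000
```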

It’s commonly assumed that any technology that comes to be broadly adopted in a field, and hence gains momentum, must be the best one for the job. Progress, in this view, is a quasi-Darwinian process. Many different technologies are invented, they compete for users and buyers, and after a period of rigorous testing and comparison the marketplace chooses the best of the bunch. Only the fittest tools survive. Society can thus be confident that the technologies it employs are the optimum ones—and that the alternatives discarded along the way were flawed in some fatal way. It’s a reassuring view of progress, founded on, in the words of the late historian David Noble, “a simple faith in objective science, economic rationality, and the market.” But as Noble went on to explain in his 1984 book Forces of Production, it’s a distorted view: “It portrays technological development as an autonomous and neutral technical process, on the one hand, and a coldly rational and self-regulating process, on the other, neither of which accounts for people, power, institutions, competing values, or different dreams.”[38] In place of the complexities, vagaries, and intrigues of history, the prevailing view of technological progress presents us with a simplistic, retrospective fantasy.

Noble illustrated the tangled way technologies actually gain acceptance and momentum through the story of the automation of the machine tool industry in the years after World War II. Inventors and engineers developed several different techniques for programming lathes, drill presses, and other factory tools, and each of the control methods had advantages and disadvantages. One of the simplest and most ingenious of the systems, called Specialmatic, was invented by a Princeton-trained engineer named Felix P. Caruthers and marketed by a small New York company called Automation Specialties. Using an array of keys and dials to encode and control the workings of a machine, Specialmatic put the power of programming into the hands of skilled machinists on the factory floor. A machine operator, explained Noble, “could set and adjust feeds and speeds, relying upon accumulated experience with the sights, sounds, and smells of metal cutting.”[39] In addition to bringing the tacit know-how of the experienced craftsman into the automated system, Specialmatic had an economic advantage: a manufacturer did not have to pay a squad of engineers and consultants to program its equipment. Caruthers’s technology earned accolades from American Machinist magazine, which noted that Specialmatic “is designed to permit complete set-up and programming at the machine.” It would allow the machinist to gain the efficiency benefits of automation while retaining “full control of his machine throughout its entire machining cycle.”[40]

But Specialmatic never gained a foothold in the market. While Caruthers was working on his invention, the U.S. Air Force was plowing money into a research program, conducted by an MIT team with long-standing ties to the military, to develop “numerical control,” a digital coding technique that was a forerunner of modern software programming. Not only did numerical control enjoy the benefits of a generous government subsidy and a prestigious academic pedigree; it appealed to business owners and managers who, faced with unremitting labor tensions, yearned to gain more control over the operation of machinery in order to undercut the power of workers and their unions. Numerical control also had the glow of a cutting-edge technology—it was carried along by the burgeoning postwar excitement over digital computers. The MIT system may have been, as the author of a Society of Manufacturing Engineers paper would later write, “a complicated, expensive monstrosity,”[41] but industrial giants like GE and Westinghouse rushed to embrace the technology, never giving alternatives like Specialmatic a chance. Far from winning a tough evolutionary battle for survival, numerical control was declared the victor before competition even began. Programming took precedence over people, and the momentum behind the technology-first design philosophy grew. As for the general public, it never knew that a choice had been made.

Engineers and programmers shouldn’t bear all the blame for the ill effects of technology-centered automation. They may be guilty at times of pursuing narrowly mechanistic dreams and desires, and they may be susceptible to the “technical arrogance” that “gives people an illusion of illimitable power,” in the words of the physicist Freeman Dyson.[42] But they’re also responding to the demands of employers and clients. Software developers always face a trade-off in writing programs for automating work. Taking the steps necessary to promote the development of expertise—restricting the scope of automation, giving a greater and more active role to people, encouraging the development of automaticity through rehearsal and repetition—entails a sacrifice of speed and yield. Learning requires inefficiency. Businesses, which seek to maximize productivity and profit, would rarely, if ever, accept such a trade-off. The main reason they invest in automation, after all, is to reduce labor costs and streamline operations.

As individuals, too, we almost always seek efficiency and convenience when we decide which software application or computing device to use. We pick the program or gadget that lightens our load and frees up our time, not the one that makes us work harder and longer. Technology companies naturally cater to such desires when they design their wares. They compete fiercely to offer the product that requires the least effort and thought to use. “At Google and all these places,” says Google executive Alan Eagle, explaining the guiding philosophy of many software and internet businesses, “we make technology as brain-dead easy to use as possible.”[43] When it comes to the development and use of commercial software, whether it underpins an industrial system or a smartphone app, abstract concerns about the fate of human talent can’t compete with the prospect of saving time and money.

I asked Parasuraman whether he thinks society will come to use automation more wisely in the future, striking a better balance between computer calculation and personal judgment, between the pursuit of efficiency and the development of expertise. He paused a moment and then, with a wry laugh, said, “I’m not very sanguine.”

Interlude, with Grave Robber

I was in a fix. I had—by necessity, not choice—struck up an alliance with a demented grave robber named Seth Briars. “I don’t eat, I don’t sleep, I don’t wash, and I don’t care,” Seth had informed me, not without a measure of pride, shortly after we met in the cemetery beside Coot’s Chapel. He knew the whereabouts of certain individuals I was seeking, and in exchange for leading me to them, he had demanded that I help him cart a load of fresh corpses out past Critchley’s Ranch to a dusty ghost town called Tumbleweed. I drove Seth’s horse-drawn wagon, while he stayed in the back, rifling the dead for valuables. The trip was a trial. We made it through an ambush by highwaymen along the route—with firearms, I was more than handy—but when I tried to cross a rickety bridge near Gaptooth Ridge, the weight of the bodies shifted and I lost control of the horses. The wagon careened into a ravine, and I died in a volcanic, screen-coating eruption of blood. I came back to life after a couple of purgatorial seconds, only to go through the ordeal again. After a half-dozen failed attempts, I began to despair of ever completing the mission.
