BOOK: What Technology Wants
Author: Kevin Kelly

It is not much different from the natural world. The birth of any species depends on an ecosystem of other species in place to support, divert, and goad its metamorphosis. We call it coevolution because of the reciprocal influence of one species upon another. In the technium many discoveries await the invention of another technological species: the proper tool or platform. The moons of Jupiter were discovered by a number of folks only a year after the telescope was invented. But the instruments by themselves didn't make the discovery. Celestial bodies were expected by astronomers. Because no one expected germs, it took nearly a century after the microscope was invented before Antonie van Leeuwenhoek spied microbes. In addition to instruments and tools, a discovery needs the proper beliefs, expectations, vocabulary, explanation, know-how, resources, funds, and appreciation to appear. But these, too, are fueled by new technologies.
An invention or discovery that is too far ahead of its time is worthless; no one can follow. Ideally, an innovation opens up only the next adjacent step from what is known and invites the culture to move forward one hop. An overly futuristic, unconventional, or visionary invention can fail initially (it may lack essential not-yet-invented materials or a critical market or proper understanding) yet succeed later, when the ecology of supporting ideas catches up. Gregor Mendel's 1865 theories of genetic heredity were correct but ignored for 35 years. His keen insights were not embraced because they did not explain the problems biologists had at the time, nor did his explanation operate by known mechanisms, so his discoveries were out of reach even for the early adopters. Decades later science faced the urgent questions that Mendel's discoveries could answer. Now his insights were only one step away. Within a few years of one another, three different scientists (Hugo de Vries, Carl Correns, and Erich von Tschermak) each independently rediscovered Mendel's forgotten work, which of course had been there all along. Kroeber claims that if you had prevented those three from rediscovery and waited another year, six scientists, not just three, would have made the then-obvious next step.
The technium's inherent sequence makes leapfrogging ahead very difficult. It would be wonderful if a society that lacks all technology infrastructure could jump to 100 percent clean, lightweight digital technology and simply skip over the heavy, dirty industrial stage. The fact that billions of poor in the developing world have purchased cheap cell phones and bypassed long waits for industrial-age landline telephones has given hope that other technologies could also leapfrog into the future. But my close examination of cell-phone adoption in China, India, Brazil, and Africa shows that the boom in cell phones around the world is accompanied by a parallel boom in copper-wire landlines. Cell phones don't cancel landlines. Instead, where cell phones go, copper follows. Cell phones train newly educated customers to need higher-bandwidth internet connections and higher-quality voice connections, which then follow in copper wires. Cell phones and solar panels and other potential leapfrog technologies are not skipping over the industrial age as much as sprinting ahead to accelerate industry's overdue arrival.
To a degree that is invisible to us, new tech sits on a foundation of old tech. Despite the vital layer of electrons that constitutes our modern economy, a huge portion of what goes on each day is fairly industrial in scope: moving atoms, rearranging atoms, mining atoms, burning atoms, refining atoms, stacking atoms. Cell phones, web pages, solar panels all rest upon heavy industry, and industry rests upon agriculture.
It is no different with our brains. Most of our brain's activity is spent on primitive processes—like walking—that we can't even perceive consciously. Instead, we are aware of only a thin, newly evolved layer of cognition that sits on and depends upon the reliable workings of older processes. You can't do calculus unless you do counting. Likewise, you can't do cell phones unless you do wires. You can't do digital infrastructure unless you do industrial. For example, a recent high-profile effort to computerize every hospital in Ethiopia was abandoned because the hospitals did not have reliable electricity. According to a study by the World Bank, a fancy technology introduced in developing countries typically reaches only 5 percent penetration before it stalls. It doesn't disseminate further until older foundational technologies catch up. Wisely, low-income countries are still rapidly inhaling industrial technologies. Big-budget infrastructure—roads, waterworks, airports, machine factories, electrical systems, power plants—is needed to make the high-tech stuff work. In a report on technological leapfrogging, the Economist concluded: “Countries that failed to adopt old technologies are at a disadvantage when it comes to new ones.”
Does this mean that if we were to try to colonize an uninhabited Earth-like planet we would be required to recapitulate history and start with sharp sticks, smoke signals, and mud-brick buildings and then work our way through each era? Would we not try to create a society from scratch using the most sophisticated technology we had?
I think we would try but that it would not work. If we were civilizing Mars, a bulldozer would be as valuable as a radio. Just like the predominance of lower functions in our brains, industrial processes predominate in the technium, even though they are gilded with informational veneers. The demassification of high technology is at times an illusion. Although the technium really does advance by using fewer atoms to do more work, information technology is not an abstract virtual world. Atoms still count. As the technium progresses, it embeds information in materials, in the same way that information and order are embedded in the atoms of a DNA molecule. Advanced high technology is the seamless fusion of bits and atoms. It is adding intelligence to industry, rather than removing industry and leaving only information.
Technologies are like organisms that require a sequence of developments to reach a particular stage. Inventions follow this uniform developmental sequence in every civilization and society, independent of human genius. You can't effectively jump ahead when you want to. But when the web of supporting technological species is in place, an invention will erupt with such urgency that it will occur to many people at once. The progression of inventions is in many ways the march toward forms dictated by physics and chemistry in a sequence determined by the rules of complexity. We might call this technology's imperative.
8
Listen to the Technology
In the early 1950s, the same thought occurred to many people at once: Things are improving so fast and so regularly that there might be a pattern to the improvements. Maybe we could plot technological progress to date, then extrapolate the curves and see what the future holds. Among the first to do this systematically was the U.S. Air Force. They needed a long-term schedule of what kinds of planes they should be funding, but aerospace was one of the fastest-moving frontiers in technology. Obviously, they would build the fastest planes possible, but since it took decades to design, approve, and then deliver a new type of plane, the generals thought it prudent to glimpse what futuristic technologies they should be funding.
So in 1953 the Air Force Office of Scientific Research plotted out the history of the fastest air vehicles. The Wright Brothers' first flight reached 6.8 kilometers per hour in 1903, and they jumped to 60 kilometers per hour two years later. The airspeed record kept increasing a bit each year, and in 1947 the fastest flight passed 1,000 kilometers per hour in a Lockheed Shooting Star flown by Colonel Albert Boyd. The record was broken four times in 1953, ending with the F-100 Super Sabre doing 1,215 kilometers per hour. Things were moving fast. And everything was pointed toward space. According to Damien Broderick, the author of The Spike, the Air Force charted the curves and metacurves of speed. It told them something preposterous. They could not believe their eyes.
[Figure: Speed Trend Curve. The U.S. Air Force's plot of historical speed records up to the 1950s and their expectations of the fastest speeds in the near future.]
The curve said they could have machines that attained orbital speed . . . within four years. And they could get their payload right out of Earth's immediate gravity well just a little later. They could have satellites almost at once, the curve insinuated, and if they wished—if they wanted to spend the money, and do the research and the engineering—they could go to the Moon quite soon after that.
It is important to remember that in 1953 none of the technology for these futuristic journeys existed. No one knew how to go that fast and survive. Even the most optimistic, die-hard visionaries did not expect a lunar landing any sooner than the proverbial “year 2000.” The only voice telling them they could do it sooner was a curve on a piece of paper. But the curve turned out to be correct. Except not politically correct. In 1957 the Soviet Union (not America!) launched Sputnik, right on schedule. Then U.S. rockets zipped to the moon 12 years later. As Broderick notes, humans arrived on the moon “close to a third of a century sooner than loony space travel buffs like Arthur C. Clarke had expected it to occur.”
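The kind of trend-line extrapolation described above can be sketched in a few lines: fit a straight line to the logarithm of the speed records quoted earlier and read off the growth rate. This is only a toy version under my own assumptions; the actual 1953 study fit metacurves (curves of the curve's own acceleration), which rise faster than a plain exponential.

```python
import math

# Airspeed records quoted in the text: (year, fastest flight in km/h)
records = [(1903, 6.8), (1905, 60.0), (1947, 1000.0), (1953, 1215.0)]

def fit_exponential(points):
    """Least-squares fit of ln(speed) = a + b * year; returns (a, b)."""
    xs = [year for year, _ in points]
    ys = [math.log(speed) for _, speed in points]
    n = len(points)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return my - b * mx, b

a, b = fit_exponential(records)
doubling_years = math.log(2) / b  # years for the record to double on this fit

def predict(year):
    """Extrapolate the fitted curve to a future year, in km/h."""
    return math.exp(a + b * year)

print(f"On this fit, airspeed records double roughly every {doubling_years:.0f} years")
```

Projecting `predict(year)` forward is exactly the move Kelly describes; whether such a projection holds depends entirely on whether the underlying growth really is exponential (or, as the Air Force found, faster).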
What did the curve know that Arthur C. Clarke did not? How did it account for the secretive efforts of the Russians as well as dozens of teams around the world? Was the curve a self-fulfilling prophecy or a revelation of an inevitable trend rooted deep in the nature of the technium? The answer may lie in the many other trends plotted since then. The most famous of them all is the trend known as Moore's Law. In brief, Moore's Law predicts that computing chips will shrink by half in size and cost every 18 to 24 months. For the past 50 years it has been astoundingly correct.
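The cumulative effect of the halving Kelly cites can be checked with simple arithmetic. This sketch just compounds the 18- and 24-month periods quoted above over the law's 50-year run:

```python
def shrink_factor(years: float, halving_months: float) -> float:
    """Cumulative size/cost reduction after `years` of halving every `halving_months` months."""
    halvings = years * 12 / halving_months
    return 2 ** halvings

# Fifty years of halving every 24 months is 25 halvings: a 2**25-fold reduction.
conservative = shrink_factor(50, 24)  # ~33.6 million-fold
aggressive = shrink_factor(50, 18)    # ~2**33.3, over 10 billion-fold
print(f"50-year reduction: {conservative:.3g}x to {aggressive:.3g}x")
```

The spread between the two cadences shows why the question of inevitability matters: a six-month difference in the doubling period compounds into nearly three orders of magnitude over five decades.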
It has been steady and true, but does Moore's Law reveal an imperative in the technium? In other words, is Moore's Law in some way inevitable? The answer is pivotal for civilization for several reasons. First, Moore's Law represents the acceleration in computer technology, which is accelerating everything else. Faster jet engines don't lead to higher corn yields, nor do better lasers lead to faster drug discoveries, but faster computer chips lead to all of these. These days all technology follows computer technology. Second, finding inevitability in one key area of technology suggests that invariance and directionality may be found in the rest of the technium.
This seminal trend of steadily increasing computing power was first noticed in 1960 by Doug Engelbart, a researcher at Stanford Research Institute (now SRI International) in Menlo Park, California, who would later go on to invent the “windows and mouse” computer interface that is now ubiquitous. When he first started as an engineer, Engelbart worked in the aerospace industry testing airplane models in wind tunnels, where he learned how systematic scaling down led to all kinds of benefits and unexpected consequences. The smaller the model, the better it flew. Engelbart imagined how the benefits of scaling down, or as he called it, “similitude,” might transfer to a new invention SRI was tracking—multiple transistors on one integrated silicon chip. Perhaps as they were made smaller, circuits might deliver a similar kind of magical similitude: the smaller a chip, the better. Engelbart presented his ideas on similitude to an audience of engineers at the 1960 Solid State Circuits Conference that included Gordon Moore, a researcher at Fairchild Semiconductor, a start-up making the integrated chips.
In the following years Moore began tracking the actual statistics of the earliest prototype chips. By 1964 he had enough data points to extrapolate the slope of the curve so far. Moore kept adding data points as the semiconductor industry grew. He was tracking all kinds of parameters—number of transistors made, cost per transistor, number of pins, logic speed, and components per wafer. But one of them was cohering into a nice curve. The trends were saying something no one else was: that the chips would keep getting smaller at a predictable rate. But how far would the trend really go?