THE INEVITABILITY OF COMPUTATION
It is not a bad definition of man to describe him as a tool-making animal. His earliest contrivances to support uncivilized life were tools of the simplest and rudest construction. His latest achievements in the substitution of machinery, not merely for the skill of the human hand, but for the relief of the human intellect, are founded on the use of tools of a still higher order.
—Charles Babbage
All of the fundamental processes we have examined—the development of the Universe, the evolution of life-forms, the subsequent evolution of technology—have progressed in an exponential fashion, some slowing down, some speeding up. What is the common thread here? Why did cosmology exponentially slow down while evolution accelerated? The answers are surprising, and fundamental to understanding the twenty-first century.
But before I attempt to answer these questions, let’s examine one other very relevant example of acceleration: the exponential growth of computation.
Early in the evolution of life-forms, specialized organs developed the ability to maintain internal states and respond differentially to external stimuli. The trend ever since has been toward more complex and capable nervous systems with the ability to store extensive memories; recognize patterns in visual, auditory, and tactile stimuli; and engage in increasingly sophisticated levels of reasoning. The ability to remember and to solve problems—computation—has constituted the cutting edge in the evolution of multicellular organisms.
The same value of computation holds true in the evolution of human-created technology. Products are more useful if they can maintain internal states and respond differentially to varying conditions and situations. As machines moved beyond mere implements to extend human reach and strength, they also began to accumulate the ability to remember and perform logical manipulations. The simple cams, gears, and levers of the Middle Ages were assembled into the elaborate automata of the European Renaissance. Mechanical calculators, which first emerged in the seventeenth century, became increasingly complex, culminating in the first automated U.S. census in 1890. Computers played a crucial role in at least one theater of the Second World War, and have developed in an accelerating spiral ever since.
THE LIFE CYCLE OF A TECHNOLOGY
Technologies fight for survival, evolve, and undergo their own characteristic life cycle. We can identify seven distinct stages. During the precursor stage, the prerequisites of a technology exist, and dreamers may contemplate these elements coming together. We do not, however, regard dreaming to be the same as inventing, even if the dreams are written down. Leonardo da Vinci drew convincing pictures of airplanes and automobiles, but he is not considered to have invented either.
The next stage, one highly celebrated in our culture, is invention, a very brief stage, not dissimilar in some respects to the process of birth after an extended period of labor. Here the inventor blends curiosity, scientific skills, determination, and usually a measure of showmanship to combine methods in a new way to bring a new technology to life.
The next stage is development, during which the invention is protected and supported by doting guardians (which may include the original inventor). Often this stage is more crucial than invention and may involve additional creation that can have greater significance than the original invention. Many tinkerers had constructed finely hand-tuned horseless carriages, but it was Henry Ford’s innovation of mass production that enabled the automobile to take root and flourish.
The fourth stage is maturity. Although continuing to evolve, the technology now has a life of its own and has become an independent and established part of the community. It may become so interwoven in the fabric of life that it appears to many observers that it will last forever. This creates an interesting drama when the next stage arrives, which I call the stage of the pretenders. Here an upstart threatens to eclipse the older technology. Its enthusiasts prematurely predict victory. While providing some distinct benefits, the newer technology is found on reflection to be missing some key element of functionality or quality. When it fails to dislodge the established order, the technology conservatives take this as evidence that the original approach will indeed live forever.
This is usually a short-lived victory for the aging technology. Shortly thereafter, another new technology typically does succeed in sending the original technology into the stage of obsolescence. In this part of the life cycle, the technology lives out its senior years in gradual decline, its original purpose and functionality now subsumed by a more spry competitor. This stage, which may comprise 5 to 10 percent of the life cycle, finally yields to antiquity (examples today: the horse and buggy, the harpsichord, the manual typewriter, and the electromechanical calculator).
To illustrate this, consider the phonograph record. In the mid-nineteenth century, there were several precursors, including Édouard-Léon Scott de Martinville’s phonautograph, a device that recorded sound vibrations as a printed pattern. It was Thomas Edison, however, who in 1877 brought all of the elements together and invented the first device that could record and reproduce sound. Further refinements were necessary for the phonograph to become commercially viable. It became a fully mature technology in 1948, when Columbia introduced the 33-revolutions-per-minute (rpm) long-playing record (LP) and RCA Victor introduced the 45-rpm small disc. The pretender was the cassette tape, introduced in the 1960s and popularized during the 1970s. Early enthusiasts predicted that its small size and ability to be rerecorded would make the relatively bulky and scratchable record obsolete.
Despite these obvious benefits, cassettes lack random access (the ability to play selections in a desired order) and are prone to their own forms of distortion and lack of fidelity. In the late 1980s and early 1990s, the digital compact disc (CD) did deliver the mortal blow. With the CD providing both random access and a level of quality close to the limits of the human auditory system, the phonograph record entered the stage of obsolescence in the first half of the 1990s. Although still produced in small quantities, the technology that Edison gave birth to more than a century ago is now approaching antiquity.
Another example is the print book, a rather mature technology today. It is now in the stage of the pretenders, with the software-based “virtual” book as the pretender. Lacking the resolution, contrast, freedom from flicker, and other visual qualities of paper and ink, the current generation of virtual book does not have the capability of displacing paper-based publications. Yet this victory of the paper-based book will be short-lived, as future generations of computer displays succeed in providing a fully satisfactory alternative to paper.
The Emergence of Moore’s Law
Gordon Moore, an inventor of the integrated circuit and then chairman of Intel, noted in 1965 that the surface area of a transistor (as etched on an integrated circuit) was being reduced by approximately 50 percent every twelve months. In 1975, he was widely reported to have revised this observation to eighteen months. Moore claims that his 1975 update was to twenty-four months, and that does appear to be a better fit to the data.
MOORE’S LAW AT WORK
The result is that every two years, you can pack twice as many transistors on an integrated circuit. This doubles both the number of components on a chip as well as its speed. Since the cost of an integrated circuit is fairly constant, the implication is that every two years you can get twice as much circuitry running at twice the speed for the same price. For many applications, that’s an effective quadrupling of the value. The observation holds true for every type of circuit, from memory chips to computer processors.
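To make the arithmetic concrete, here is a minimal Python sketch, my own illustration rather than anything from the original text, of the compounding just described; the twenty-four-month doubling period and the constant chip cost are the assumptions stated above:

```python
def moores_law_value(years, doubling_period=2.0):
    """Relative price-performance after `years`, assuming transistor count
    and speed each double every `doubling_period` years while the cost of
    the chip stays constant (simplifying assumptions from the text)."""
    doublings = years / doubling_period
    transistors = 2 ** doublings   # relative number of components on the chip
    speed = 2 ** doublings         # relative speed of the circuitry
    return transistors * speed     # effective value for the same price

print(moores_law_value(2))    # one doubling period: a fourfold gain in value
print(moores_law_value(10))   # a decade: roughly a thousandfold gain
```

After a single two-year period the sketch reproduces the fourfold gain in value noted above; over a decade the same compounding yields roughly a thousandfold improvement for the same price.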
This insightful observation has become known as Moore’s Law on Integrated Circuits, and the remarkable phenomenon of the law has been driving the acceleration of computing for the past forty years. But how much longer can this go on? The chip companies have expressed confidence in another fifteen to twenty years of Moore’s Law by continuing their practice of using increasingly higher resolutions of optical lithography (a process similar to photographic printing) to reduce the feature size—measured today in millionths of a meter—of transistors and other key components.[18]
But then—after almost sixty years—this paradigm will break down. The transistor insulators will then be just a few atoms thick, and the conventional approach of shrinking them won’t work.
What then?
We first note that the exponential growth of computing did not start with Moore’s Law on Integrated Circuits. In the accompanying figure, “The Exponential Growth of Computing, 1900-1998,”[19] I plotted forty-nine notable computing machines spanning the twentieth century on an exponential chart, in which the vertical axis represents powers of ten in computer speed per unit cost (as measured in the number of “calculations per second” that can be purchased for $1,000). Each point on the graph represents one of the machines. The first five machines used mechanical technology, followed by three electromechanical (relay based) computers, followed by eleven vacuum-tube machines, followed by twelve machines using discrete transistors. Only the last eighteen computers used integrated circuits.
I then fit a fourth-order polynomial to the points, a curve that is free to bend as many as three times. In other words, I did not try to force a straight line through the points; I simply found the closest fourth-order curve. Yet a straight line is close to what I got. A straight line on an exponential graph means exponential growth. A careful examination of the trend shows that the curve is actually bending slightly upward, indicating a small exponential growth in the rate of exponential growth. This may result from the interaction of two different exponential trends, as I will discuss in chapter 6, “Building New Brains.” Or there may indeed be two levels of exponential growth. Yet even if we take the more conservative view that there is just one level of acceleration, we can see that the exponential growth of computing did not start with Moore’s Law on Integrated Circuits, but dates back to the advent of electrical computing at the beginning of the twentieth century.
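For readers who want to see what such a curve fit involves, here is a brief Python sketch using NumPy; the data points are placeholders invented for illustration, not the forty-nine machines plotted in the figure:

```python
import numpy as np

# Illustrative placeholder data: (year, calculations per second per $1,000).
# These values are invented for the example, not taken from the figure.
years = np.array([1900, 1920, 1940, 1955, 1965, 1975, 1985, 1995], dtype=float)
speed_per_1000_dollars = np.array([1e-5, 1e-4, 1e-2, 1e0, 1e2, 1e4, 1e6, 1e8])

# Work on a logarithmic scale, where exponential growth plots as a straight line.
log_speed = np.log10(speed_per_1000_dollars)

# Center the years so the fourth-order fit stays numerically well conditioned.
x = years - years.mean()
coefficients = np.polyfit(x, log_speed, deg=4)
trend = np.poly1d(coefficients)

# The fitted curve is free to bend; a nearly straight line indicates exponential
# growth, and upward curvature indicates that the growth rate is itself rising.
print(trend(1998 - years.mean()))           # predicted log10(speed per $1,000) in 1998
print(trend.deriv(2)(1998 - years.mean()))  # curvature of the log-scale trend near 1998
```

Examining the curvature of the fitted trend near the end of the range is one simple way to check whether the line is bending slightly upward, as described above.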
[Figure: “The Exponential Growth of Computing, 1900-1998.” Machine categories shown include Mechanical Computing Devices and Electromechanical (Relay Based) Computers.]