The Age of Spiritual Machines: When Computers Exceed Human Intelligence

Ray Kurzweil

Let’s see how the Law of Time and Chaos applies to our examples. If chaos is increasing, the Law of Time and Chaos implies the following sublaw:
The Law of Increasing Chaos: As chaos exponentially increases, time exponentially slows down (that is, the time interval between salient events grows longer as time passes).
 
This fits the Universe rather well. When the entire Universe was just a “naked” singularity—a perfectly orderly single point in space and time—there was no chaos and conspicuous events took almost no time at all. As the Universe grew in size, chaos increased exponentially, and so did the timescale for epochal changes. Now, with billions of galaxies sprawled out over billions of light-years of space, the Universe contains vast reaches of chaos, and indeed requires billions of years to get everything organized for a paradigm shift to take place.
We see a similar phenomenon in the progression of an organism’s life. We start out as a single fertilized cell, so there’s only rather limited chaos there. By the time we have grown to trillions of cells, chaos has greatly expanded. Finally, at the end of our lives, our designs deteriorate, engendering even greater randomness. So the time period between salient biological events grows longer as we grow older. And that is indeed what we experience.
But it is the opposite spiral of the Law of Time and Chaos that is the most important and relevant for our purposes. Consider the inverse sublaw, which I call the Law of Accelerating Returns:
The Law of Accelerating Returns: As order exponentially increases, time exponentially speeds up (that is, the time interval between salient events grows shorter as time passes).
 
The Law of Accelerating Returns (to distinguish it from a better-known law in which returns diminish) applies specifically to evolutionary processes. In an evolutionary process, it is order—the opposite of chaos—that is increasing. And, as we have seen, time speeds up.
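Put loosely in symbols (one possible formalization, offered only as a reading aid), let $\tau(t)$ be the interval between salient events, $C(t)$ the relevant chaos, and $O(t)$ the relevant order. The two sublaws then read

$$\tau(t) \;\propto\; C(t) \qquad\text{and}\qquad \tau(t) \;\propto\; \frac{1}{O(t)},$$

so when chaos grows roughly as $e^{kt}$ (the Universe as a whole), the interval between salient events stretches exponentially, and when order grows roughly as $e^{kt}$ (an evolutionary process), that interval shrinks exponentially.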
Disorder and Order
 
I noted above that the concept of chaos in the Law of Time and Chaos is tricky. Chaos alone is not sufficient—disorder for our purposes requires randomness that is relevant to the process we are concerned with. The opposite of disorder—which I called “order” in the above Law of Accelerating Returns—is even trickier.
Let’s start with our definition of disorder and work backward. If disorder represents a random sequence of events, then the opposite of disorder should imply “not random.” And if random means unpredictable, then we might conclude that order means predictable. But that would be wrong.
Borrowing a page from information theory,[21] consider the difference between information and noise. Information is a sequence of data that is meaningful in a process, such as the DNA code of an organism, or the bits in a computer program. Noise, on the other hand, is a random sequence.
Neither noise nor information is predictable. Noise is inherently unpredictable, but carries no information. Information, however, is also unpredictable. If we can predict future data from past data, then that future data stops being information. For example, consider a sequence which simply alternates between zero and one (01010101 ...). Such a sequence is certainly orderly, and very predictable. Precisely because it is so predictable, we do not consider it information-bearing, beyond the first couple of bits.
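One rough way to make the distinction concrete is with a general-purpose compressor: a perfectly predictable sequence squeezes down to almost nothing, while random noise barely compresses at all. The sketch below is illustrative only (the sequence lengths and compression level are arbitrary choices) and uses Python’s standard zlib and os modules:

```python
import os
import zlib

# A perfectly predictable sequence: bytes alternating 0, 1, 0, 1, ...
predictable = bytes([0, 1] * 50_000)   # 100,000 bytes of pure regularity
# Random noise: unpredictable, but meaningless for any process.
noise = os.urandom(100_000)            # 100,000 bytes of randomness

for label, data in [("predictable", predictable), ("noise", noise)]:
    compressed = zlib.compress(data, 9)
    ratio = len(compressed) / len(data)
    print(f"{label:>11}: {len(data):,} bytes -> {len(compressed):,} bytes "
          f"(ratio {ratio:.3f})")

# Typical result: the alternating sequence collapses to a tiny fraction of its
# size, while the noise stays at (or slightly above) its original size.
# Predictability means few surprises, and therefore little information beyond
# the first couple of bits.
```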
Thus orderliness does not constitute order because order requires information. So, perhaps I should use the word information instead of order.
However, information alone is not sufficient for our purposes either. Consider a phone book. It certainly represents a lot of information, and some order as well. Yet if we double the size of the phone book, we have increased the amount of data, but we have not achieved a deeper level of order.
Order, then, is information that fits a purpose.
The measure of order is the measure of how well the information fits the purpose. In the evolution of life-forms, the purpose is to survive. In an evolutionary algorithm (a computer program that simulates evolution to solve a problem) applied to, say, investing in the stock market, the purpose is to make money. Simply having more information does not necessarily result in a better fit. A superior solution for a purpose may very well involve less data.
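To make “an evolutionary algorithm” concrete, here is a minimal sketch of one: it evolves bit strings toward an arbitrary all-ones target. The target, population size, and mutation rate are invented purely for illustration and have nothing to do with stock picking:

```python
import random

TARGET = [1] * 20          # hypothetical "purpose": match an all-ones bit string
POP_SIZE, GENERATIONS, MUTATION_RATE = 30, 60, 0.05

def fitness(individual):
    # How well the information fits the purpose: count of bits matching TARGET.
    return sum(a == b for a, b in zip(individual, TARGET))

def mutate(individual):
    # Random variation: flip each bit with a small probability.
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in individual]

def crossover(a, b):
    # Recombination: splice two parents at a random cut point.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    # Selection prunes the less fit half; offspring refill the population.
    population.sort(key=fitness, reverse=True)
    survivors = population[: POP_SIZE // 2]
    children = [mutate(crossover(random.choice(survivors), random.choice(survivors)))
                for _ in range(POP_SIZE - len(survivors))]
    population = survivors + children

best = max(population, key=fitness)
print(f"best fitness: {fitness(best)} / {len(TARGET)}")
```

The point of the sketch is the shape of the loop: random variation supplies options, selection prunes them against the purpose, and the order already accumulated is carried forward into the next generation.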
The concept of “complexity” has been used recently to describe the nature of the information created by an evolutionary process. Complexity is a reasonably close fit to the concept of order that I am describing. After all, the designs created by the evolution of life-forms on Earth appear to have become more complex over time. However, complexity is not a perfect fit, either. Sometimes, a deeper order—a better fit to a purpose—is achieved through simplification rather than further increases in complexity. As Einstein said, “Everything should be made as simple as possible, but no simpler.” For example, a new theory that ties together apparently disparate ideas into one broader, more coherent theory reduces complexity but nonetheless may increase the “order for a purpose” that I am describing. Evolution has shown, however, that the general trend toward greater order does typically result in greater complexity.[22]
Thus improving a solution to a problem—which may increase or decrease complexity—increases order. Now that just leaves the issue of defining the problem. And as we will see, defining a problem well is often the key to finding its solution.
The Law of Increasing Entropy Versus the Growth of Order
 
Another consideration is how the Law of Time and Chaos relates to the second law of thermodynamics. Unlike the second law, the Law of Time and Chaos is not necessarily concerned with a closed system. It deals instead with a process. The Universe is a closed system (not subject to outside influence, since there is nothing outside the Universe), so in accordance with the second law of thermodynamics, chaos increases and time slows down. In contrast, evolution is precisely not a closed system. It takes place amid great chaos, and indeed depends on the disorder in its midst, from which it draws its options for diversity.
And from these options, an evolutionary process continually prunes its choices to create ever greater order. Even a crisis that appears to introduce a significant new source of chaos is likely to end up increasing—deepening—the order created by an evolutionary process. For example, consider the asteroid that is thought to have killed off big organisms such as the dinosaurs 65 million years ago. The crash of that asteroid suddenly created a vast increase in chaos (and lots of dust, too). Yet it appears to have hastened the rise of mammals in the niche previously dominated by large reptiles and ultimately led to the emergence of a technology-creating species. When the dust settled (literally), the crisis of the asteroid had increased order.
As I pointed out earlier, only a tiny fraction of the stuff in the Universe, or even on a life- and technology-bearing planet such as Earth, can be considered to be part of evolution’s inventions. Thus evolution does not contradict the Law of Increasing Entropy. Indeed, it depends on it to provide a never-ending supply of options.
As I noted, given the emergence of life, the emergence of a technology-creating species—and of technology—is inevitable. Technology is the continuation of evolution by other means, and is itself an evolutionary process. So it, too, speeds up.
A primary reason that evolution—of life-forms or of technology—speeds up is that it builds on its own increasing order. Innovations created by evolution encourage and enable faster evolution. In the case of the evolution of life-forms, the most notable example is DNA, which provides a recorded and protected transcription of life’s design from which to launch further experiments.
In the case of the evolution of technology, ever improving human methods of recording information have fostered further technology. The first computers were designed on paper and assembled by hand. Today, they are designed on computer workstations with the computers themselves working out many details of the next generation’s design, and are then produced in fully automated factories with human guidance but limited direct intervention.
The evolutionary process of technology seeks to improve capabilities in an exponential fashion. Innovators seek to improve things by multiples. Innovation is multiplicative, not additive. Technology, like any evolutionary process, builds on itself. This aspect will continue to accelerate when the technology itself takes full control of its own progression.
We can thus conclude the following with regard to the evolution of life-forms, and of technology:
The Law of Accelerating Returns as Applied to an Evolutionary Process:

▲ An evolutionary process is not a closed system; therefore, evolution draws upon the chaos in the larger system in which it takes place for its options for diversity; and

▲ Evolution builds on its own increasing order.

Therefore:

▲ In an evolutionary process, order increases exponentially.

Therefore:

▲ Time exponentially speeds up.

Therefore:

▲ The returns (that is, the valuable products of the process) accelerate.
 
The phenomenon of time slowing down and speeding up is occurring simultaneously. Cosmologically speaking, the Universe continues to slow down. Evolution, now most noticeably in the form of human-created technology, continues to speed up. These are the two sides—two interleaved spirals—of the Law of Time and Chaos.
The spiral we are most interested in—the Law of Accelerating Returns—gives us ever greater order in technology, which inevitably leads to the emergence of computation. Computation is the essence of order. It provides the ability for a technology to respond in a variable and appropriate manner to its environment to carry out its mission. Thus computational technology is also an evolutionary process, and also builds on its own progress. The time to accomplish a fixed objective gets exponentially shorter over time (for example, ninety years for the first MIPS per thousand dollars versus one day for an additional MIPS today). That the power of computing grows exponentially over time is just another way to say the same thing.
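As a back-of-the-envelope check on that example (the two-year doubling interval below is an illustrative assumption, not a figure given above), going from ninety years per MIPS to one day per MIPS corresponds to roughly fifteen doublings of price-performance:

```python
import math

# The example above: ~90 years for the first MIPS per $1,000,
# versus ~1 day for an additional MIPS today.
first_mips_days = 90 * 365.25   # about 32,872 days
extra_mips_days = 1.0

# If price-performance doubles at a fixed interval, the time needed to add a
# fixed increment of capability halves with each doubling.
speedup = first_mips_days / extra_mips_days
doublings = math.log2(speedup)
print(f"speed-up factor: {speedup:,.0f}x")
print(f"equivalent doublings of price-performance: {doublings:.1f}")

# Assuming, purely for illustration, one doubling every two years:
years_per_doubling = 2
print(f"implied span at {years_per_doubling} yr/doubling: "
      f"{doublings * years_per_doubling:.0f} years")
```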
So Where Does That Leave Moore’s Law?
 
Well, it still leaves it dead by the year 2020. Moore’s Law came along in 1958 just when it was needed and will have done its sixty years of service by 2018, a rather long period of time for a paradigm nowadays. Unlike Moore’s Law, however, the Law of Accelerating Returns is not a temporary methodology. It is a basic attribute of the nature of time and chaos, a sublaw of the Law of Time and Chaos, and describes a wide range of apparently divergent phenomena and trends. In accordance with the Law of Accelerating Returns, another computational technology will pick up where Moore’s Law will have left off, without missing a beat.
Most Exponential Trends Hit a Wall ... but Not This One
 
A frequent criticism of predictions of the future is that they rely on mindless extrapolation of current trends without consideration of forces that may terminate or alter that trend. This criticism is particularly relevant in the case of exponential trends. A classic example is a species happening upon a hospitable new habitat, perhaps transplanted there by human intervention (rabbits in Australia, say). Its numbers multiply exponentially for a while, but this phenomenon is quickly terminated when the exploding population runs into a new predator or the limits of its environment. Similarly, the geometric population growth of our own species has been a source of anxiety, but changing social and economic factors, including growing prosperity, have greatly slowed this expansion in recent years, even in developing countries.
 
THE LEARNING CURVE: SLUG VERSUS HUMAN
 
The “learning curve” describes the mastery of a skill over time. As an entity (slug or human) learns a new skill, the newly acquired ability builds on itself, and so the learning curve starts out looking like the exponential growth we see in the Law of Accelerating Returns. Skills tend to be bounded, so as the new expertise is mastered, the law of diminishing returns sets in, and growth in mastery levels off. So the learning curve is what we call an S curve, because exponential growth followed by a leveling off looks like an S leaning slightly to the right.
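One common way to model such an S curve is the logistic function, which climbs like an exponential at first and then flattens toward a ceiling. The comparison below is a minimal sketch with arbitrary constants, not a model of any particular skill:

```python
import math

def logistic(t, ceiling=1.0, rate=1.0, midpoint=0.0):
    # Standard S curve: near-exponential growth early, leveling off near the ceiling.
    return ceiling / (1 + math.exp(-rate * (t - midpoint)))

def exponential(t, rate=1.0):
    # Unbounded exponential growth, for comparison.
    return math.exp(rate * t)

for t in range(-4, 7, 2):
    print(f"t={t:+d}   logistic={logistic(t):.3f}   exponential={exponential(t):8.1f}")

# Well below the midpoint the two curves track each other closely; past the
# midpoint the logistic flattens toward its ceiling while the exponential keeps
# climbing. Innovation, in the argument that follows, is what removes the ceiling.
```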
The learning curve is remarkably universal: Most multicellular creatures do it. Slugs, for example, follow the learning curve when learning how to ascend a new tree in search of leaves. Humans, of course, are always learning something new.
But there’s a salient difference between humans and slugs. Humans are capable of innovation, which is the creation and retention of new skills and knowledge. Innovation is the driving force in the Law of Accelerating Returns, and eliminates the leveling-off part of the S curve. So innovation turns the S curve into indefinite exponential expansion.
Overcoming the S curve is another way to express the unique status of the human species. No other species appears to do this. Why are we unique in this way, given that other primates are so close to us in terms of genetic similarity?
The reason is that the ability to overcome the S curve defines a new ecological niche. As I pointed out, there were indeed other humanoid species and subspecies capable of innovation, but the niche seems to have tolerated only one surviving competitor. But we will have company in the twenty-first century as our machines join us in this exclusive niche.
 
