The Fabric of the Cosmos: Space, Time, and the Texture of Reality
Author: Brian Greene
Because of its inherently probabilistic nature, quantum mechanics differs sharply from any previous fundamental description of the universe, qualitative or quantitative. Since its inception last century, physicists have struggled to mesh this strange and unexpected framework with the common worldview; the struggle is still very much under way. The problem lies in reconciling the macroscopic experience of day-to-day life with the microscopic reality revealed by quantum mechanics. We are used to living in a world that, while admittedly subject to the vagaries of economic or political happenstance, appears stable and reliable at least as far as its physical properties are concerned. You do not worry that the atomic constituents of the air you are now breathing will suddenly disband, leaving you gasping for breath as they manifest their quantum wavelike character by rematerializing, willy-nilly, on the dark side of the moon. And you are right not to fret about this outcome, because according to quantum mechanics the probability of its happening, while not zero, is absurdly small. But what makes the probability so small?
There are two main reasons. First, on a scale set by atoms, the moon is enormously far away. And, as mentioned, in many circumstances (although by no means all), the quantum equations show that a probability wave typically has an appreciable value in some small region of space and quickly drops nearly to zero as you move away from this region (as in Figure 4.5). So the likelihood that even a single electron that you expect to be in the same room as you—such as one of those that you just exhaled—will be found in a moment or two on the dark side of the moon, while not zero, is extremely small. So small that it makes the probability that you will marry Nicole Kidman or Antonio Banderas seem enormous by comparison. Second, there are a lot of electrons, as well as protons and neutrons, making up the air in your room. The likelihood that all of these particles will do what is extremely unlikely even for one is so small that it's hardly worth a moment's thought. It would be like not only marrying your movie-star heartthrob but then also winning every state lottery every week for, well, a length of time that would make the current age of the universe seem a mere cosmic flicker.
This gives some sense of why we do not directly encounter the probabilistic aspects of quantum mechanics in day-to-day life. Nevertheless, because experiments confirm that quantum mechanics does describe fundamental physics, it presents a frontal assault on our basic beliefs as to what constitutes reality. Einstein, in particular, was deeply troubled by the probabilistic character of quantum theory. Physics, he would emphasize again and again, is in the business of determining with certainty what has happened, what is happening, and what will happen in the world around us. Physicists are not bookies, and physics is not the business of calculating odds. But Einstein could not deny that quantum mechanics was enormously successful in explaining and predicting, albeit in a statistical framework, experimental observations of the microworld. And so rather than attempting to show that quantum mechanics was wrong, a task that still looks like a fool's errand in light of its unparalleled successes, Einstein expended much effort on trying to show that quantum mechanics was not the final word on how the universe works. Even though he could not say what it was, Einstein wanted to convince everyone that there was a deeper and less bizarre description of the universe yet to be found.
Over the course of many years, Einstein mounted a series of ever more sophisticated challenges aimed at revealing gaps in the structure of quantum mechanics. One such challenge, raised in 1927 at the Fifth Physical Conference of the Solvay Institute,⁸ concerns the fact that even though an electron's probability wave might look like that in Figure 4.5, whenever we measure the electron's whereabouts we always find it at one definite position or another. So, Einstein asked, doesn't that mean that the probability wave is merely a temporary stand-in for a more precise description—one yet to be discovered—that would predict the electron's position with certainty? After all, if the electron is found at X, doesn't that mean, in reality, it was at or very near X a moment before the measurement was carried out? And if so, Einstein prodded, doesn't quantum mechanics' reliance on the probability wave—a wave that, in this example, says the electron had some probability to have been far from X—reflect the theory's inadequacy to describe the true underlying reality?
Einstein's viewpoint is simple and compelling. What could be more natural than to expect a particle to be located at, or, at the very least, near where it's found a moment later? If that's the case, a deeper understanding of physics should provide that information and dispense with the coarser framework of probabilities. But the Danish physicist Niels Bohr and his entourage of quantum mechanics defenders disagreed. Such reasoning, they argued, is rooted in conventional thinking, according to which each electron follows a single, definite path as it wanders to and fro. And this thinking is strongly challenged by Figure 4.4, since if each electron did follow one definite path—like the classical image of a bullet fired from a gun—it would be extremely hard to explain the observed interference pattern: what would be interfering with what? Ordinary bullets fired one by one from a single gun certainly can't interfere with each other, so if electrons did travel like bullets, how would we explain the pattern in Figure 4.4?
Instead, according to Bohr and the Copenhagen interpretation of quantum mechanics he forcefully championed, before one measures the electron's position there is no sense in even asking where it is. It does not have a definite position. The probability wave encodes the likelihood that the electron, when examined suitably, will be found here or there, and that truly is all that can be said about its position. Period. The electron has a definite position in the usual intuitive sense only at the moment we "look" at it—at the moment when we measure its position—identifying its location with certainty. But before (and after) we do that, all it has are potential positions described by a probability wave that, like any wave, is subject to interference effects. It's not that the electron has a position and that we don't know the position before we do our measurement. Rather, contrary to what you'd expect, the electron simply does not have a definite position before the measurement is taken.
This is a radically strange reality. In this view, when we measure the electron's position we are not measuring an objective, preexisting feature of reality. Rather, the act of measurement is deeply enmeshed in creating the very reality it is measuring. Scaling this up from electrons to everyday life, Einstein quipped, "Do you really believe that the moon is not there unless we are looking at it?" The adherents of quantum mechanics responded with a version of the old saw about a tree falling in a forest: if no one is looking at the moon—if no one is "measuring its location by seeing it"—then there is no way for us to know whether it's there, so there is no point in asking the question. Einstein found this deeply unsatisfying. It was wildly at odds with his conception of reality; he firmly believed that the moon is there, whether or not anyone is looking. But the quantum stalwarts were unconvinced.
Einstein's second challenge, raised at the Solvay conference in 1930, followed closely on the first. He described a hypothetical device, which (through a clever combination of a scale, a clock, and a cameralike shutter) seemed to establish that a particle like an electron must have definite features—before it is measured or examined—that quantum mechanics said it couldn't. The details are not essential but the resolution is particularly ironic. When Bohr learned of Einstein's challenge, he was knocked back on his heels—at first, he couldn't see a flaw in Einstein's argument. Yet, within days, he bounced back and fully refuted Einstein's claim. And the surprising thing is that the key to Bohr's response was general relativity! Bohr realized that Einstein had failed to take account of his own discovery that gravity warps time—that a clock ticks at a rate dependent on the gravitational field it experiences. When this complication was included, Einstein was forced to admit that his conclusions fell right in line with orthodox quantum theory.
Even though his objections were shot down, Einstein remained deeply uncomfortable with quantum mechanics. In the following years he kept Bohr and his colleagues on their toes, leveling one new challenge after another. His most potent and far-reaching attack focused on something known as the uncertainty principle, a direct consequence of quantum mechanics, enunciated in 1927 by Werner Heisenberg.
The uncertainty principle provides a sharp, quantitative measure of how tightly probability is woven into the fabric of a quantum universe. To understand it, think of the prix-fixe menus in certain Chinese restaurants. Dishes are arranged in two columns, A and B, and if, for example, you order the first dish in column A, you are not allowed to order the first dish in column B; if you order the second dish in column A, you are not allowed to order the second dish in column B, and so forth. In this way, the restaurant has set up a dietary dualism, a culinary complementarity (one, in particular, that is designed to prevent you from piling up the most expensive dishes). On the prix-fixe menu you can have Peking Duck or Lobster Cantonese, but not both.
Heisenberg's uncertainty principle is similar. It says, roughly speaking, that the physical features of the microscopic realm (particle positions, velocities, energies, angular momenta, and so on) can be divided into two lists, A and B. And as Heisenberg discovered, knowledge of the first feature from list A fundamentally compromises your ability to have knowledge about the first feature from list B; knowledge of the second feature from list A fundamentally compromises your ability to have knowledge of the second feature from list B; and so on. Moreover, like being allowed a dish containing some Peking Duck and some Lobster Cantonese, but only in proportions that add up to the same total price, the more precise your knowledge of a feature from one list, the less precise your knowledge can possibly be about the corresponding feature from the second list. The fundamental inability to determine simultaneously all features from both lists—to determine with certainty all of these features of the microscopic realm—is the uncertainty revealed by Heisenberg's principle.
As an example, the more precisely you know where a particle is, the less precisely you can possibly know its speed. Similarly, the more precisely you know how fast a particle is moving, the less you can possibly know about where it is. Quantum theory thereby sets up its own duality: you can determine with precision certain physical features of the microscopic realm, but in so doing you eliminate the possibility of precisely determining certain other, complementary features.
To understand why, let's follow a rough description developed by Heisenberg himself, which, while incomplete in particular ways that we will discuss, does give a useful intuitive picture. When we measure the position of any object, we generally interact with it in some manner. If we search for the light switch in a dark room, we know we have located it when we touch it. If a bat is searching for a field mouse, it bounces sonar off its target and interprets the reflected wave. The most common instance of all is locating something by seeing it—by receiving light that has reflected off the object and entered our eyes. The key point is that these interactions not only affect us but also affect the object whose position is being determined. Even light, when bouncing off an object, gives it a tiny push. Now, for day-to-day objects such as the book in your hand or a clock on the wall, the wispy little push of bouncing light has no noticeable effect. But when it strikes a tiny particle like an electron it can have a big effect: as the light bounces off the electron, it changes the electron's speed, much as your own speed is affected by a strong, gusty wind that whips around a street corner. In fact, the more precisely you want to identify the electron's position, the more sharply defined and energetic the light beam must be, yielding an even larger effect on the electron's motion.
This means that if you measure an electron's position with high accuracy, you necessarily contaminate your own experiment: the act of precision position measurement disrupts the electron's velocity. You can therefore know precisely where the electron is, but you cannot also know precisely how fast, at that moment, it was moving. Conversely, you can measure precisely how fast an electron is moving, but in so doing you will contaminate your ability to determine with precision its position. Nature has a built-in limit on the precision with which such complementary features can be determined. And although we are focusing on electrons, the uncertainty principle is completely general: it applies to everything.
In day-to-day life we routinely speak about things like a car passing a particular stop sign (position) while traveling at 90 miles per hour (velocity), blithely specifying these two physical features. In reality, quantum mechanics says that such a statement has no precise meaning since you can't ever simultaneously measure a definite position and a definite speed. The reason we get away with such incorrect descriptions of the physical world is that on everyday scales the amount of uncertainty involved is tiny and generally goes unnoticed. You see, Heisenberg's principle does not just declare uncertainty, it also specifies—with complete certainty—the minimum amount of uncertainty in any situation. If we apply his formula to your car's velocity just as it passes a stop sign whose position is known to within a centimeter, then the uncertainty in speed turns out to be just shy of a billionth of a billionth of a billionth of a billionth of a mile per hour. A state trooper would be fully complying with the laws of quantum physics if he asserted that your speed was between 89.99999999999999999999999999999999999 and 90.00000000000000000000000000000000001 miles per hour as you blew past the stop sign; so much for a possible uncertainty-principle defense. But if we were to replace your massive car with a delicate electron whose position we knew to within a billionth of a meter, then the uncertainty in its speed would be a whopping 100,000 miles per hour. Uncertainty is always present, but it becomes significant only on microscopic scales.
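Both numbers follow from Heisenberg's formula, which bounds the product of the position and velocity uncertainties by Planck's constant divided by the object's mass. A minimal sketch of the arithmetic, assuming a roughly 1,500-kilogram car (the text does not specify a mass, so the car's figure is approximate):

```python
# Heisenberg's relation: delta_x * delta_p >= hbar / 2.
# With momentum p = m * v, the minimum velocity uncertainty is
# delta_v >= hbar / (2 * m * delta_x).

HBAR = 1.054571817e-34   # reduced Planck constant, in joule-seconds
MPS_TO_MPH = 2.23694     # meters per second -> miles per hour

def min_velocity_uncertainty_mph(mass_kg, delta_x_m):
    """Smallest possible speed uncertainty, in mph, given a position uncertainty."""
    return HBAR / (2 * mass_kg * delta_x_m) * MPS_TO_MPH

# A car (assumed ~1,500 kg) whose position is known to within a centimeter:
car = min_velocity_uncertainty_mph(1500.0, 0.01)

# An electron (9.109e-31 kg) whose position is known to within a billionth of a meter:
electron = min_velocity_uncertainty_mph(9.109e-31, 1e-9)

print(f"car:      {car:.1e} mph")       # negligibly small, far below any measurable speed
print(f"electron: {electron:.1e} mph")  # roughly 100,000 mph, the scale quoted in the text
```

The gap of some forty orders of magnitude between the two results is the whole story of why quantum uncertainty goes unnoticed at everyday scales.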
The explanation of uncertainty as arising through the unavoidable disturbance caused by the measurement process has provided physicists with a useful intuitive guide as well as a powerful explanatory framework in certain specific situations. However, it can also be misleading. It may give the impression that uncertainty arises only when we lumbering experimenters meddle with things. This is not true. Uncertainty is built into the wave structure of quantum mechanics and exists whether or not we carry out some clumsy measurement. As an example, take a look at a particularly simple probability wave for a particle, the analog of a gently rolling ocean wave, shown in Figure 4.6. Since the peaks are all uniformly moving to the right, you might guess that this wave describes a particle moving with the velocity of the wave peaks; experiments confirm that supposition. But where is the particle? Since the wave is uniformly spread throughout space, there is no way for us to say the electron is here or there. When measured, it literally could be found anywhere. So, while we know precisely how fast the particle is moving, there is huge uncertainty about its position. And as you see, this conclusion does not depend on our disturbing the particle. We never touched it. Instead, it relies on a basic feature of waves: they can be spread out.
Although the details get more involved, similar reasoning applies to all other wave shapes, so the general lesson is clear. In quantum mechanics, uncertainty just is.
Figure 4.6 A probability wave with a uniform succession of peaks and troughs represents a particle with a definite velocity. But since the peaks and troughs are uniformly spread in space, the particle's position is completely undetermined. It has an equal likelihood of being anywhere.
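The caption's claim can be checked directly. A wave with one uniform wavelength is written as a complex exponential; by de Broglie's relation a single wavelength means a single, definite momentum, yet the wave's probability density is exactly flat, so every position is equally likely. A minimal numerical sketch (the one-nanometer wavelength is an arbitrary choice for illustration):

```python
import numpy as np

# A plane wave psi(x) = exp(i * k * x) has exactly one wavelength,
# and hence (via de Broglie, p = hbar * k) exactly one momentum.
wavelength = 1e-9                  # illustrative choice: 1 nanometer
k = 2 * np.pi / wavelength         # the corresponding wavenumber

x = np.linspace(0, 5 * wavelength, 1000)
psi = np.exp(1j * k * x)

# The probability density |psi|^2 is perfectly uniform: the particle's
# speed is sharply defined, but its position is completely undetermined.
prob = np.abs(psi) ** 2
print(prob.max() - prob.min())     # zero, up to floating-point rounding
```

No measurement, and hence no disturbance, appears anywhere in this calculation; the position uncertainty is a property of the wave's shape alone.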