What is the difference between a computer and a black hole? This question sounds like the start of a Microsoft joke, but it is one of the most profound problems in physics today. Most people think of computers as specialized gizmos: streamlined boxes sitting on a desk or fingernail-size chips embedded in high-tech coffeepots. But to a physicist, all physical systems are computers. Rocks, atom bombs and galaxies may not run Linux, but they, too, register and process information. Every electron, photon and other elementary particle stores bits of data, and every time two such particles interact, those bits are transformed. Physical existence and information content are inextricably linked. As physicist John A. Wheeler of Princeton University says, "it from bit."
Black holes might seem like the exception to the rule that everything computes. Inputting information into them presents no difficulty, but according to Einstein's general theory of relativity, getting information out is impossible. Matter that enters a hole is assimilated, the details of its composition lost irretrievably. In the 1970s Stephen Hawking of the University of Cambridge showed that when quantum mechanics is taken into account, black holes do have an output: they glow like a hot coal. In Hawking's analysis, this radiation is random, however. It carries no information about what went in. If an elephant fell in, an elephant's worth of energy would come out—but the energy would be a hodgepodge that could not be used, even in principle, to re-create the animal.
That apparent loss of information poses a serious conundrum, because the laws of quantum mechanics preserve information. So other scientists, including Leonard Susskind of Stanford University, John Preskill of the California Institute of Technology and Gerard 't Hooft of the University of Utrecht in the Netherlands, have argued that the outgoing radiation is not, in fact, random—that it is a processed form of the matter that falls in. In 2004 Hawking came around to their point of view. Black holes, too, compute.
Black holes are merely the most exotic example of the general principle that the universe registers and processes information. The principle itself is not new. In the 19th century the founders of statistical mechanics developed what would later be called information theory to explain the laws of thermodynamics. At first glance, thermodynamics and information theory are worlds apart: one was developed to describe steam engines, the other to optimize communications. Yet the thermodynamic quantity called entropy, which limits the ability of an engine to do useful work, turns out to be proportional to the number of bits registered by the positions and velocities of the molecules in a substance. The invention of quantum mechanics in the 20th century put this discovery on a firm quantitative foundation and introduced scientists to the remarkable concept of quantum information. The bits that make up the universe are quantum bits, or qubits, with far richer properties than ordinary bits.
Analyzing the universe in terms of bits and bytes does not replace analyzing it in conventional terms such as force and energy, but it does uncover new and surprising facts. In the field of statistical mechanics, for example, it unknotted the paradox of Maxwell's demon, a contraption that seemed to allow for perpetual motion. In recent years, we and other physicists have been applying the same insights to cosmology and fundamental physics: the nature of black holes, the fine-scale structure of spacetime, the behavior of cosmic dark energy, the ultimate laws of nature. The universe is not just a giant computer; it is a giant quantum computer. As physicist Paola Zizzi of the University of Padua in Italy says, "it from qubit."
WHEN GIGAHERTZ IS TOO SLOW
THE CONFLUENCE of physics and information theory flows from the central maxim of quantum mechanics: at bottom, nature is discrete. A physical system can be described using a finite number of bits. Each particle in the system acts like the logic gate of a computer. Its spin axis can point in one of two directions, thereby encoding a bit, and can flip over, thereby performing a simple computational operation.
The system is also discrete in time. It takes a minimum amount of time to flip a bit. The exact amount is given by a theorem named after two pioneers of the physics of information processing, Norman Margolus of the Massachusetts Institute of Technology and Lev Levitin of Boston University. This theorem is related to the Heisenberg uncertainty principle, which describes the inherent trade-offs in measuring physical quantities, such as position and momentum or time and energy. The theorem says that the time it takes to flip a bit, t, depends on the amount of energy you apply, E. The more energy you apply, the shorter the time can be. Mathematically, the rule is t ≥ h/4E, where h is Planck's constant, the main parameter of quantum theory. For example, one type of experimental quantum computer stores bits on protons and uses magnetic fields to flip them. The operations take place in the minimum time allowed by the Margolus-Levitin theorem.
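To make the bound concrete, here is a minimal sketch of the Margolus-Levitin limit in Python; the one-electron-volt energy scale is simply our illustrative choice, not a figure from the article.

```python
# Minimum time to flip a bit, t >= h / (4E), per the Margolus-Levitin theorem.
h = 6.626e-34          # Planck's constant, joule-seconds
eV = 1.602e-19         # one electron volt, in joules

def min_flip_time(energy_joules):
    """Shortest possible bit-flip time for a given amount of energy."""
    return h / (4 * energy_joules)

# Example (our choice): a bit driven with one electron volt of energy
# cannot flip faster than about once per femtosecond.
print(min_flip_time(1 * eV))   # ~1e-15 seconds
```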
From this theorem, a huge variety of conclusions can be drawn, from limits on the geometry of spacetime to the computational capacity of the universe as a whole. As a warm-up, consider the limits to the computational power of ordinary matter—in this case, one kilogram occupying the volume of one liter. We call this device the ultimate laptop.
Its battery is simply the matter itself, converted directly to energy per Einstein's famous formula E = mc^2. Putting all this energy into flipping bits, the computer can do 10^51 operations per second, slowing down gradually as the energy degrades. The memory capacity of the machine can be calculated using thermodynamics. When one kilogram of matter is converted to energy in a liter volume, its temperature is one billion kelvins. Its entropy, which is proportional to the energy divided by the temperature, corresponds to 10^31 bits of information. The ultimate laptop stores information in the microscopic motions and positions of the elementary particles zipping around inside it. Every single bit allowed by the laws of thermodynamics is put to use.
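These two figures can be checked with a few lines of arithmetic. The sketch below follows the reasoning in the paragraph above; the constants are standard, and the one-billion-kelvin temperature is the figure quoted in the text.

```python
import math

# Back-of-the-envelope check of the ultimate laptop's speed and memory.
h = 6.626e-34           # Planck's constant, J*s
k = 1.381e-23           # Boltzmann's constant, J/K
c = 2.998e8             # speed of light, m/s

mass = 1.0                          # one kilogram of matter
energy = mass * c**2                # E = mc^2, roughly 9e16 joules
ops_per_second = 4 * energy / h     # Margolus-Levitin rate

temperature = 1e9                   # ~one billion kelvins (figure from the text)
entropy = energy / temperature      # S ~ E/T, in joules per kelvin
bits = entropy / (k * math.log(2))  # one bit per k*ln(2) of entropy

print(f"speed  ~ 10^{math.log10(ops_per_second):.0f} ops per second")  # ~10^51
print(f"memory ~ 10^{math.log10(bits):.0f} bits")                      # ~10^31
```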
Whenever particles interact, they can cause one another to flip. This process can be thought of in terms of a programming language such as C or Java: the particles are the variables, and their interactions are operations such as addition. Each bit can flip 10^20 times per second, equivalent to a clock speed of 100 giga-gigahertz. In fact, the system is too fast to be controlled by a central clock. The time it takes a bit to flip is approximately equal to the time it takes a signal to travel from one bit to its neighbor. Thus, the ultimate laptop is highly parallel: it acts not as a single processor but as a vast array of processors, each working almost independently and communicating its results to the others comparatively slowly.
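The claim that flipping and signaling take comparable times can be checked directly: with 10^31 bits packed into a liter, the average spacing between bits is a few trillionths of a meter, and light needs about 10^-20 second to cross that gap, the same order as the flip time. A minimal sketch, using the figures quoted above:

```python
c = 2.998e8                       # speed of light, m/s
bits = 1e31                       # memory of the ultimate laptop (from the text)
volume = 1e-3                     # one liter, in cubic meters

spacing = (volume / bits) ** (1 / 3)   # average distance between bits, ~5e-12 m
signal_time = spacing / c              # light-travel time between neighbors, ~1.5e-20 s
flip_time = 1e-20                      # per-bit flip time for 10^20 flips per second

print(signal_time, flip_time)          # same order of magnitude: no central clock can keep up
```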
In comparison, a conventional computer flips bits about 10^9 times per second, stores about 10^12 bits and contains a single processor. If Moore's law could be sustained, your descendants would be able to buy an ultimate laptop midway through the 23rd century. Engineers would have to find a way to exert precise control on the interactions of particles in a plasma hotter than the sun's core, and much of the communications bandwidth would be taken up in controlling the computer and dealing with errors. Engineers would also have to solve some knotty packaging problems.
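The "23rd century" remark follows from a simple extrapolation. The doubling time below is our assumption (the article does not state one); anything in the range of one and a half to two years lands the crossover somewhere in the 23rd century.

```python
import math

# How long until Moore's law closes the gap between today's ~10^9 ops/s
# and the ultimate laptop's ~10^51 ops/s? (Doubling time is assumed.)
doublings = math.log2(1e51 / 1e9)              # about 140 doublings needed

for years_per_doubling in (1.5, 2.0):          # assumed range
    print(2005 + doublings * years_per_doubling)   # roughly 2215 to 2285
```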
In a sense, however, you can already purchase such a device, if you know the right people. A one-kilogram chunk of matter converted completely to energy—this is a working definition of a 20-megaton hydrogen bomb. An exploding nuclear weapon is processing a huge amount of information, its input given by its initial configuration and its output given by the radiation it emits.
FROM NANOTECH TO XENNOTECH
IF ANY CHUNK of matter is a computer, a black hole is nothing more or less than a computer compressed to its smallest possible size. As a computer shrinks, the gravitational force that its components exert on one another becomes stronger and eventually grows so intense that no material object can escape. The size of a black hole, called the Schwarzschild radius, is directly proportional to its mass.
A one-kilogram hole has a radius of about 10^-27 meter, or one xennometer. (For comparison, a proton has a radius of 10^-15 meter.) Shrinking the computer does not change its energy content, so it can perform 10^51 operations per second, just as before. What does change is the memory capacity. When gravity is insignificant, the total storage capacity is proportional to the number of particles and thus to the volume. But when gravity dominates, it interconnects the particles, so collectively they are capable of storing less information. The total storage capacity of a black hole is proportional to its surface area. In the 1970s Hawking and Jacob D. Bekenstein of the Hebrew University of Jerusalem calculated that a one-kilogram black hole can register about 10^16 bits—much less than the same computer before it was compressed.
In compensation, the black hole is a much faster processor. In fact, the amount of time it takes to flip a bit, 10^-35 second, is equal to the amount of time it takes light to move from one side of the computer to the other. Thus, in contrast to the ultimate laptop, which is highly parallel, the black hole is a serial computer. It acts as a single unit.
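The figures in the last two paragraphs can be reproduced to order of magnitude from the Schwarzschild radius and the Bekenstein-Hawking entropy. A rough sketch (numerical factors of order one are not to be trusted):

```python
import math

# One-kilogram black hole computer: size, memory and per-bit flip time.
G = 6.674e-11                    # gravitational constant
c = 2.998e8                      # speed of light
h = 6.626e-34                    # Planck's constant
hbar = h / (2 * math.pi)
l_P = math.sqrt(hbar * G / c**3)             # Planck length, ~1.6e-35 m

M = 1.0                                      # kilograms
r_s = 2 * G * M / c**2                       # Schwarzschild radius, ~10^-27 m
ops = 4 * M * c**2 / h                       # Margolus-Levitin rate, ~10^51 ops/s
area = 4 * math.pi * r_s**2
bits = area / (4 * l_P**2 * math.log(2))     # Bekenstein-Hawking memory, ~10^16 bits

flip_time = bits / ops                       # time needed to flip any one bit
crossing_time = 2 * r_s / c                  # light-crossing time of the hole

print(r_s, bits)
print(flip_time, crossing_time)   # both within an order of magnitude of 10^-35 s: a serial machine
```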
How would a black hole computer work in practice? Input is not problematic: just encode the data in the form of matter or energy and throw them down the hole. By properly preparing the material that falls in, a hacker should be able to program the hole to perform any desired computation. Once the material enters a hole, it is gone for good; the so-called event horizon demarcates the point of no return. The plummeting particles interact with one another, performing computation for a finite time before reaching the center of the hole—the singularity—and ceasing to exist. What happens to matter as it gets squished together at the singularity depends on the details of quantum gravity, which are as yet unknown.
The output takes the form of Hawking radiation. A one-kilogram hole gives off Hawking radiation and, to conserve energy, decreases in mass, disappearing altogether in a mere 10^-21 second. The peak wavelength of the radiation equals the radius of the hole; for a one-kilogram hole, it corresponds to extremely intense gamma rays. A particle detector can capture this radiation and decode it for human consumption.
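To get a feel for how hard this radiation is, assume (as the text does) that the typical wavelength is about the hole's radius; the corresponding photon energy then dwarfs anything produced by familiar gamma-ray sources. A rough estimate:

```python
h, c = 6.626e-34, 2.998e8             # Planck's constant and speed of light
wavelength = 1.5e-27                  # ~ Schwarzschild radius of a one-kilogram hole, m

photon_energy_joules = h * c / wavelength
photon_energy_eV = photon_energy_joules / 1.602e-19
print(photon_energy_eV)               # roughly 10^21 eV per photon
```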
Hawking's study of the radiation that bears his name is what overturned the conventional wisdom that black holes are objects from which nothing whatsoever can escape. The rate at which black holes radiate is inversely related to their size, so big black holes, such as those at the center of galaxies, lose energy much more slowly than they gobble up matter. In the future, however, experimenters may be able to create tiny holes in particle accelerators, and these holes should explode almost immediately in a burst of radiation. A black hole can be thought of not as a fixed object but as a transient congregation of matter that performs computation at the maximum rate possible.
ESCAPE PLAN
THE REAL QUESTION is whether Hawking radiation returns the answer of the computation or merely gibberish. The issue remains contentious, but most physicists, including Hawking, now think that the radiation is a highly processed version of the information that went into the hole during its formation. Although matter cannot leave the hole, its information content can. Understanding precisely how is one of the liveliest questions in physics right now.
In 2003 Gary Horowitz of the University of California, Santa Barbara, and Juan Maldacena of the Institute for Advanced Study in Princeton, N.J., outlined one possible mechanism. The escape hatch is entanglement, a quantum phenomenon in which the properties of two or more systems remain correlated across the reaches of space and time. Entanglement enables teleportation, in which information is transferred from one particle to another with such fidelity that the particle has effectively been beamed from one location to another at up to the speed of light.
The teleportation procedure, which has been demonstrated in the laboratory, first requires that two particles be entangled. Then a measurement is performed on one of the particles jointly with some matter that contains information to be teleported. The measurement erases the information from its original location, but because of entanglement, that information resides in an encoded form on the second particle, no matter how distant it may be. The information can be decoded using the results of the measurement as the key.
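As an illustration of the protocol just described, here is a minimal state-vector simulation in Python (using NumPy). It is a sketch of textbook qubit teleportation, not a model of black hole physics: qubit 0 carries the message, qubits 1 and 2 form the entangled pair, and the measurement results serve as the decoding key.

```python
import numpy as np

rng = np.random.default_rng()

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],        # control = first qubit, target = second
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Message qubit: an arbitrary state alpha|0> + beta|1>, normalized at random.
alpha, beta = rng.normal(size=2) + 1j * rng.normal(size=2)
psi = np.array([alpha, beta]) / np.sqrt(abs(alpha)**2 + abs(beta)**2)

bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
state = np.kron(psi, bell)                                  # 3-qubit state, qubit 0 first

# Joint measurement step: entangle the message with one half of the pair.
state = np.kron(CNOT, I2) @ state              # CNOT from qubit 0 onto qubit 1
state = np.kron(H, np.kron(I2, I2)) @ state    # Hadamard on qubit 0

# Measure qubits 0 and 1.
probs = np.abs(state.reshape(2, 2, 2))**2
p = probs.sum(axis=2).ravel()                  # outcome probabilities for (m0, m1)
outcome = rng.choice(4, p=p)
m0, m1 = divmod(outcome, 2)

# Qubit 2's state, conditioned on the measurement result.
received = state.reshape(2, 2, 2)[m0, m1, :]
received = received / np.linalg.norm(received)

# Decode using the measurement results as the key.
if m1: received = X @ received
if m0: received = Z @ received

# The received state matches the original up to a global phase.
print(abs(np.vdot(psi, received)))             # prints 1.0
```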
A similar procedure might work for black holes. Pairs of entangled photons materialize at the event horizon. One of the photons flies outward to become the Hawking radiation that an observer sees. The other falls in and hits the singularity together with the matter that formed the hole in the first place. The annihilation of the infalling photon acts as a measurement, transferring the information contained in the matter to the outgoing Hawking radiation.
Other researchers have proposed escape mechanisms that also rely on weird quantum phenomena. In 1996 Andrew Strominger and Cumrun Vafa of Harvard University suggested that black holes are composite bodies made up of multidimensional structures called branes, which arise in string theory. Information falling into the black hole is stored in waves in the branes and can eventually leak out. In 2004 Samir Mathur of Ohio State University and his collaborators modeled a black hole as a giant tangle of strings. This "fuzzball" acts as a repository of the information carried by things that fall into the black hole. It emits radiation that reflects this information. Hawking has argued that quantum fluctuations prevent a well-defined event horizon from ever forming. The jury is still out on all these ideas.
CYBERSPACETIME
THE PROPERTIES of black holes are inextricably intertwined with those of spacetime. Thus, if holes can be thought of as computers, so can spacetime itself. Quantum mechanics predicts that spacetime, like other physical systems, is discrete. Distances and time intervals cannot be measured to infinite precision; on small scales, spacetime is bubbly and foamy. The maximum amount of information that can be put into a region of space depends on how small the bits are, and they cannot be smaller than the foamy cells.
Physicists have long assumed that the size of these cells is the Planck length (lP) of 10^-35 meter, which is the distance at which both quantum fluctuations and gravitational effects are important. If so, the foamy nature of spacetime will always be too minuscule to observe. But as one of us (Ng) and Hendrik van Dam of the University of North Carolina at Chapel Hill and Frigyes Károlyházy of Eötvös Loránd University in Hungary have shown, the cells are actually much larger and, indeed, have no fixed size: the larger a region of spacetime, the larger its constituent cells. At first, this assertion may seem paradoxical—as though the atoms in an elephant were bigger than those in a mouse. In fact, Lloyd has derived it from the same laws that limit the power of computers.
The process of mapping the geometry of spacetime is a kind of computation, in which distances are gauged by transmitting and processing information. One way to do this is to fill a region of space with a swarm of Global Positioning System satellites, each containing a clock and a radio transmitter. To measure a distance, a satellite sends a signal and times how long it takes to arrive. The precision of the measurement depends on how fast the clocks tick. Ticking is a computational operation, so its maximum rate is given by the Margolus-Levitin theorem: the time between ticks is inversely proportional to the energy.
The energy, in turn, is also limited. If you give the satellites too much energy or pack them too closely together, they will form a black hole and will no longer be able to participate in mapping. (The hole will still emit Hawking radiation, but that radiation has a wavelength the size of the hole itself and so is not useful for mapping features on a finer scale.) The maximum total energy of the constellation of satellites is proportional to the radius of the region being mapped.
Thus, the energy increases more slowly than the volume of the region does. As the region gets bigger, the cartographer faces an unavoidable trade-off: reduce the density of satellites (so they are spaced farther apart) or reduce the energy available to each satellite (so that their clocks tick more slowly). Either way, the measurement becomes less precise. Mathematically, in the time it takes to map a region of radius R, the total number of ticks by all the satellites is R^2/lP^2. If each satellite ticks precisely once during the mapping process, the satellites are spaced out by an average distance of R^(1/3) lP^(2/3). Shorter distances can be measured in one subregion but only at the expense of reduced precision in some other subregion. The argument applies even if space is expanding.
This formula gives the precision to which distances can be determined; it is applicable when the measurement apparatus is just on the verge of becoming a black hole. Below the minimum scale, spacetime geometry ceases to exist. That level of precision is much, much bigger than the Planck length. To be sure, it is still very small. The average imprecision in measuring the size of the observable universe is about 10^-15 meter. Nevertheless, such an imprecision might be detectable by precise distance-measuring equipment, such as future gravitational-wave observatories.
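A one-line check of that figure, under the assumption that the observable universe has a radius of roughly 4 x 10^26 meters (our number, not the article's):

```python
l_P = 1.6e-35                   # Planck length, meters
l = 4e26                        # rough radius of the observable universe, m (assumed)

delta_l = l ** (1 / 3) * l_P ** (2 / 3)   # accumulated spacetime-foam fuzziness
print(delta_l)                            # ~5e-15 m, i.e. about 10^-15 meter
```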
From a theorist's point of view, the broader significance of this result is that it provides a new way to look at black holes. Ng has shown that the strange scaling of spacetime fluctuations with the cube root of distances provides a back-door way to derive the Bekenstein-Hawking formula for black hole memory. It also implies a universal bound for all black hole computers: the number of bits in the memory is proportional to the square of the computation rate. The proportionality constant is Gh/c^5—mathematically demonstrating the linkage between information and the theories of special relativity (whose defining parameter is the speed of light, c), general relativity (the gravitational constant, G) and quantum mechanics (h).
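As a sanity check, plugging the one-kilogram black hole's computation rate into this bound recovers the memory quoted earlier; the sketch below drops numerical factors of order one.

```python
G, h, c = 6.674e-11, 6.626e-34, 2.998e8

ops_rate = 5e50                     # ~10^51 operations per second (from the text)
memory_bits = (G * h / c**5) * ops_rate**2
print(memory_bits)                  # within an order of magnitude of the 10^16 bits quoted earlier
```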
Perhaps most significantly, the result leads directly to the holographic principle, which suggests that our three-dimensional universe is, in some deep but unfathomable way, two-dimensional. The maximum amount of information that any region of space can store seems to be proportional not to its volume but to its surface area. The holographic principle is normally thought to arise from the unknown details of quantum gravity, yet it also follows directly from the fundamental quantum limits to the precision of measurement.
THE ANSWER IS... 42
THE PRINCIPLES of computation can be applied not just to the most compact computers (black holes) and the tiniest possible computers (spacetime foam) but also to the largest: the universe. The universe may well be infinite in extent, but it has existed a finite length of time, at least in its present form. The observable part is currently some tens of billions of light-years across. For us to know the results of a computation, it must have taken place within this expanse.
The above analysis of clock ticks also gives the number of operations that can have occurred in the universe since it began: 10^123. Compare this limit with the behavior of the matter around us—the visible matter, the dark matter and the so-called dark energy that is causing the universe to expand at an accelerated rate. The observed cosmic energy density is about 10^-9 joule per cubic meter, so the universe contains 10^72 joules of energy. According to the Margolus-Levitin theorem, it can perform up to 10^106 operations per second, for a total of 10^123 operations during its lifetime so far. In other words, the universe has performed the maximum possible number of operations allowed by the laws of physics.
To calculate the total memory capacity of conventional matter, such as atoms, one can apply the standard methods of statistical mechanics and cosmology. Matter can embody the most information when it is converted to energetic, massless particles, such as neutrinos or photons, whose entropy density is proportional to the cube of their temperature. The energy density of the particles (which determines the number of operations they can perform) goes as the fourth power of their temperature. Therefore, the total number of bits is just the number of operations raised to the three-fourths power. For the whole universe, that amounts to 10^92 bits. If the particles contain some internal structure, the number of bits might be somewhat higher. These bits flip faster than they intercommunicate, so the conventional matter is a highly parallel computer, like the ultimate laptop and unlike the black hole.
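The chain of estimates in the last two paragraphs fits in a few lines. The radius and age of the observable universe below are our rough figures; the energy density is the one quoted in the text.

```python
import math

h = 6.626e-34                    # Planck's constant, J*s
energy_density = 1e-9            # observed cosmic energy density, J/m^3 (from the text)
radius = 4e26                    # rough radius of the observable universe, m (assumed)
age = 4.3e17                     # rough age of the universe, s (about 13.7 billion years)

volume = (4 / 3) * math.pi * radius**3
energy = energy_density * volume           # a few times 10^71 J, i.e. roughly 10^72 J
ops_per_second = 4 * energy / h            # Margolus-Levitin rate, ~10^106 ops per second
total_ops = ops_per_second * age           # operations performed so far
matter_bits = total_ops ** 0.75            # three-fourths-power scaling argued above

print(f"operations so far ~ 10^{math.log10(total_ops):.0f}")    # ~10^123
print(f"bits in matter    ~ 10^{math.log10(matter_bits):.0f}")  # ~10^92
```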
As for dark energy, physicists do not know what it is, let alone how to calculate how much information it can store. But the holographic principle implies that the universe can store a maximum of 10^123 bits—nearly the same as the total number of operations. This approximate equality is not a coincidence. Our universe is close to its critical density. If it had been slightly more dense, it might have undergone gravitational collapse, just like the matter falling into a black hole. So it meets (or nearly meets) the conditions for maxing out the number of computations. That maximum number is R^2/lP^2, which is the same as the number of bits given by the holographic principle. At each epoch in its history, the maximum number of bits that the universe can contain is approximately equal to the number of operations it could have performed up to that moment.
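The holographic figure itself is quick to check, again taking the radius of the observable universe to be roughly 4 x 10^26 meters (our number):

```python
import math

l_P = 1.6e-35                   # Planck length, meters
R = 4e26                        # rough radius of the observable universe, m (assumed)

holographic_bits = (R / l_P) ** 2
print(f"~ 10^{math.log10(holographic_bits):.0f} bits")   # ~10^123
```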
Whereas ordinary matter undergoes a huge number of operations, dark energy behaves quite differently. If it encodes the maximum number of bits allowed by the holographic principle, then the overwhelming majority of those bits have had time to flip no more than once over the course of cosmic history. So these unconventional bits are mere spectators to the computations performed at much higher speeds by the smaller number of conventional bits. Whatever the dark energy is, it is not doing very much computation. It does not have to. Supplying the missing mass of the universe and accelerating its expansion are simple tasks, computationally speaking.
What is the universe computing? As far as we can tell, it is not producing a single answer to a single question, like the giant Deep Thought computer in the science-fiction classic The Hitchhiker's Guide to the Galaxy. Instead the universe is computing itself. Powered by Standard Model software, the universe computes quantum fields, chemicals, bacteria, human beings, stars and galaxies. As it computes, it maps out its own spacetime geometry to the ultimate precision allowed by the laws of physics. Computation is existence.
These results spanning ordinary computers, black holes, spacetime foam and cosmology are testimony to the unity of nature. They demonstrate the conceptual interconnections of fundamental physics. Although physicists do not yet possess a full theory of quantum gravity, whatever that theory is, they know it is intimately connected with quantum information. It from qubit.
THE AUTHORS
SETH LLOYD and Y. JACK NG bridge the two most exciting fields of theoretical physics: quantum information theory and the quantum theory of gravity. Lloyd, professor of quantum-mechanical engineering at the Massachusetts Institute of Technology, designed the first feasible quantum computer. He works with various teams to construct and operate quantum computers and communications systems. Ng, professor of physics at the University of North Carolina at Chapel Hill, studies the fundamental nature of spacetime. He has proposed various ways to look for the quantum structure of spacetime experimentally. Both researchers say their most skeptical audience is their family. When Lloyd told his daughters that everything is made of bits, one responded bluntly: "You're wrong, Daddy. Everything is made of atoms, except light." Ng has lost credibility on the subject because he is always having to turn to his sons for help with his computer.