Throughout the history of science, one of the prime goals of making sense of the Universe has been to discover what’s fundamental. Many of the things we observe and interact with in the modern, macroscopic world are composed of, and can be derived from, smaller particles and the underlying laws that govern them. The idea that everything is made of elements dates back thousands of years, and has taken us from alchemy to chemistry to atoms to subatomic particles to the Standard Model, including the radical concept of a quantum Universe.
But even though there’s very good evidence that all of the fundamental entities in the Universe are quantum at some level, that doesn’t mean that everything is both discrete and quantized. So long as we still don’t fully understand gravity at a quantum level, space and time might still be continuous at a fundamental level. Here’s what we know so far.
Quantum mechanics is the idea that, if you go down to a small enough scale, everything that contains energy, whether it’s massive (like an electron) or massless (like a photon), can be broken down into individual quanta. You can think of these quanta as energy packets, which sometimes behave as particles and other times behave as waves, depending on what they interact with.
Everything in nature obeys the laws of quantum physics, and our “classical” laws that apply to larger, more macroscopic systems can always (at least in theory) be derived, or emerge, from the more fundamental quantum rules. But not everything is necessarily discrete, or capable of being divided into a localized region of space.
If you have the conduction band of a metal, for example, and ask “where is this electron that occupies the band,” there’s no discreteness there. The electron can be anywhere, continuously, within the band. A free photon can have any wavelength and energy; no discreteness there. Just because something is quantized, or fundamentally quantum in nature, doesn’t mean everything about it must be discrete.
The idea that space (or space and time, since they’re inextricably linked by Einstein’s theories of relativity) could be quantized goes way back to Heisenberg himself. Famous for the Uncertainty Principle, which fundamentally limits how precisely we can measure certain pairs of quantities (like position and momentum), Heisenberg realized that certain quantities diverged, or went to infinity, when you tried to calculate them in quantum field theory.
He noticed that if you postulated a minimum distance scale to space, on the other hand, these infinities would go away. In math/physics speak, the theory became renormalizable, which means we can calculate things sensibly.
You can get an intuitive grasp on this by imagining you have a quantum particle you’ve placed in a box. “Where is the particle,” you ask? Well, you can make a measurement, and you’ll have an uncertainty associated with it: the momentum uncertainty will be proportional to ħ/L, where ħ is the reduced Planck constant and L is the size of the box.
Normally, the uncertainty part (ħ/L) is small compared to the quantity you’re measuring, but this won’t be the case if L is too small. In fact, if it is, then the additional terms that we normally neglect, like (ħ/L)², will give an even bigger correction. This is why it’s tempting to introduce a “cutoff scale,” or an L that we don’t allow ourselves to go smaller than. This minimum distance scale could save us a lot of headaches in quantum physics.
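To put rough numbers on this intuition, here’s a minimal Python sketch (my own illustration, not from the text) of the estimate Δp ~ ħ/L: the smaller the box, the larger the momentum uncertainty, which is exactly why very small values of L make the normally negligible correction terms blow up.

```python
# Rough sketch of the Heisenberg estimate delta_p ~ hbar / L:
# shrinking the box grows the momentum uncertainty in proportion.

HBAR = 1.054571817e-34  # reduced Planck constant, in J*s

def momentum_uncertainty(box_size_m: float) -> float:
    """Minimum momentum uncertainty (kg*m/s) for a particle
    confined to a box of size L, from the estimate delta_p ~ hbar/L."""
    return HBAR / box_size_m

# Each factor-of-10 shrink in the box multiplies the uncertainty by 10:
for L in (1e-10, 1e-11, 1e-12):  # atomic scale down to sub-picometer
    print(f"L = {L:.0e} m  ->  delta_p ~ {momentum_uncertainty(L):.2e} kg*m/s")
```

The point of the sketch is only the scaling: as L shrinks, ħ/L (and its powers) stop being small corrections, which is what motivates imposing a cutoff.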
When you take even non-quantized gravity into account, as shown by physicist Alden Mead in the 1960s, you find that gravity amplifies the uncertainty inherent to position, as set forth by Heisenberg. It becomes impossible to make sense of distances below a length scale known as the Planck length: around 10⁻³⁵ meters. This argument has come up in a new incarnation, in string theories, since the 1990s.
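That scale can be checked with a back-of-the-envelope calculation. This sketch (my own arithmetic, using the standard constants) confirms that the Planck length, ℓ_P = √(ħG/c³), really does land near 10⁻³⁵ meters:

```python
import math

# Back-of-the-envelope check that the Planck length,
# l_P = sqrt(hbar * G / c^3), comes out near 10^-35 meters.

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # Newton's gravitational constant, m^3/(kg*s^2)
C = 2.99792458e8        # speed of light, m/s

planck_length = math.sqrt(HBAR * G / C**3)
print(f"Planck length ~ {planck_length:.2e} m")
```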
But we don’t have a final theory of gravity, and so we don’t know if this problem is a real, insurmountable one that necessarily implies that space is discrete. Heisenberg’s original difficulty came when he tried to renormalize Fermi’s theory of beta decay; it couldn’t work without a minimum length scale. But since our development of electroweak theory and the Standard Model, we no longer need a discrete, minimum length scale to handle radioactive decay. A better theory can do just fine without it.
So where are we now on the question of whether space and time are quantized? We have three major possibilities, all of which have fascinating implications.
1.) Space and/or time are discrete. Imagine that there’s a shortest-possible length scale. Now what? There’s a problem: in Einstein’s theory of relativity, you can put down an imaginary ruler, anywhere, and it will appear to shorten based on the speed at which you move relative to it. If space were quantized, people moving at different velocities would measure a different fundamental length scale!
That strongly suggests there would be a “privileged” frame of reference, in which an observer at one particular velocity would measure the maximum possible length, while all observers in motion relative to it would measure something shorter. Not everyone likes this perspective, because it requires you to give up something important in physics, like Lorentz invariance or locality. Discretizing time also poses big problems for General Relativity, as John Baez and Bill Unruh have noted.
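The clash with relativity can be made concrete with the standard length-contraction formula, L′ = L√(1 − v²/c²). This toy sketch (my own, purely illustrative) shows observers at different speeds assigning different values to a hypothetical “fundamental” length:

```python
import math

# Toy illustration of the conflict: if a "fundamental" shortest length
# existed, Lorentz contraction (L' = L * sqrt(1 - v^2/c^2)) would make
# observers at different speeds measure different values for it.

C = 2.99792458e8  # speed of light, m/s

def contracted_length(rest_length_m: float, speed_m_s: float) -> float:
    """Length a moving observer measures for a ruler of the given rest length."""
    return rest_length_m * math.sqrt(1.0 - (speed_m_s / C) ** 2)

PLANCK_LENGTH = 1.616e-35  # hypothetical "minimum" length, in meters
for fraction in (0.0, 0.6, 0.8):
    v = fraction * C
    L = contracted_length(PLANCK_LENGTH, v)
    print(f"v = {fraction:.1f}c  ->  measured length {L:.3e} m")
```

Only the observer at rest relative to the ruler (v = 0) gets the full value; everyone else measures something shorter, which is exactly the “privileged frame” problem described above.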
2.) Space and time are both continuous. It’s possible, on the other hand, that the problems we perceive now aren’t insurmountable, but are rather artifacts of having an incomplete theory of the quantum Universe. It’s possible that space and time really are continuous backgrounds, and even though they’re quantum in nature, they cannot be broken up into fundamental units. Spacetime might be foamy, with large energy fluctuations on tiny scales, but there might not be a smallest scale. When we do successfully find a quantum theory of gravity, it may have a continuous-but-quantum fabric after all.
3.) Space and/or time may be either discrete or continuous, but there’s a finite resolution we can achieve. This gets at the heart of the difference between what may be “real” or “fundamental” and what is measurable. Imagine you have a continuous structure, but your ability to view it is what’s limited. If you went down to a small-enough distance scale, it would appear blurred. We might not be able to see whether it’s truly continuous or discrete; we could only tell that we cannot resolve structure below a certain length scale.
Incredibly, there may actually be a way to test whether there is a smallest length scale or not. Three years before he died, physicist Jacob Bekenstein put forth a brilliant idea for an experiment in which a single photon would pass through a crystal, causing it to move by a slight amount. Because photons can be tuned in energy (continuously), and because a crystal is so massive that a single photon’s momentum kick displaces it by only a minuscule amount, it ought to be possible to detect whether the “steps” the crystal moves in are discrete or continuous. With a low-enough energy photon, if space is quantized, the crystal would either move a single quantum step or not at all.
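To get a feel for why this proposal is so clever, here’s a rough numerical sketch. The photon energy, block mass, and thickness below are my own assumed tabletop values, not Bekenstein’s exact parameters, and I use the order-of-magnitude estimate that a photon of energy E traversing a transparent block of mass M and thickness L displaces it by roughly Δx ≈ (E/Mc²)·L:

```python
# Rough numbers for a Bekenstein-style experiment: a single optical
# photon traversing a small transparent block shifts it by roughly
# delta_x = (E / (M * c^2)) * L, which for the assumed tabletop
# parameters lands right around the Planck length.

C = 2.99792458e8           # speed of light, m/s
E_PHOTON = 3.2e-19         # a ~2 eV visible photon, in joules (assumed)
M_BLOCK = 2e-4             # a ~0.2 gram block, in kg (assumed)
L_BLOCK = 1e-3             # block thickness of ~1 mm (assumed)
PLANCK_LENGTH = 1.616e-35  # m

delta_x = (E_PHOTON / (M_BLOCK * C**2)) * L_BLOCK
print(f"block displacement ~ {delta_x:.1e} m")
print(f"ratio to Planck length ~ {delta_x / PLANCK_LENGTH:.1f}")
```

With these assumed numbers the displacement comes out comparable to the Planck length itself, which is why such a tabletop setup could, in principle, probe whether the block’s motion is quantized at that scale.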
The idea that there could be a smallest possible scale, either in distance or time, is a fascinating one that has puzzled physicists since it was first considered. Sure, everything is quantum, but not everything is discrete. In Einstein’s relativity, space and time are still treated as two linked parts of a continuous fabric. In quantum field theory, spacetime is the continuous stage on which the dance of the quanta takes place. But there ought to be a quantum theory of gravity at the core of it all. The question of “discrete or continuous?” contains some fascinating possibilities, including the possibility that we cannot know below a certain scale. Although many assume one answer or another, at this point, we need more information before we truly know what our Universe is up to at a fundamental level.