The cosmos

What is the universe like, in its most basic and fundamental aspects? Well, we all know that the way material things behave, on the most microscopic levels, is odd, to say the least. Quantum mechanics tells us that the smallest things that make up the matter we see around us sometimes act like tiny particles with well-defined locations. At other times, though, they act like some kind of wave phenomenon, spread out in space. Moreover, they behave unpredictably, sometimes making “quantum jumps” from one state to another.

But setting matter and how it behaves to one side for a moment, you may well not have heard that the quantum revolution has also fundamentally changed our understanding of space and time. So you probably continue to live in the comfortable, everyday world of medium-sized objects, spread out in space with well-defined positions and states of motion, existing over time and typically changing their locations (in space, and relative to one another) as time passes. Moreover, in this picture the spatial distance between things matters: objects can only affect each other by direct contact; or, if they can affect one another “at a distance” – via gravity, for instance, or electromagnetic fields such as light – then spatial separation strongly limits such effects. The farther away two things are from each other, the less they can affect one another. Extremely distant objects have essentially no effect on what happens in the here and now.

This is the way we thought about our universe in the 19th century and earlier; and it is still the way we understand it as we go about our day-to-day lives. In fact, the main difference in the way we think of the universe today, compared to the way an educated European thought of it in the wake of Galileo and Newton, is probably in terms of its sheer immensity: we now know that our Sun and planets occupy a minuscule portion of an immense, lenticular galaxy (the Milky Way), which itself is just one of the hundreds of billions of galaxies in the part of the universe that is visible to us.

You probably know that Einstein’s special and general relativity theories (1905 and 1915, respectively) overthrew our common-sense, classical understanding of space and time. In classical physics – and in our common-sense way of thinking about it, even today – space and time are quite different things. Space is understood, metaphorically at least, as a container in which every physical thing occupies a place. Time is something that inexorably passes, for everybody everywhere in exactly the same way: minute by minute, day by day, we move further and further into what is, for us now, the future.

Einstein’s relativity theories brought some radical changes to our conceptions of space and time. Time ceased to be an absolute, passing in the same way for everybody in the universe, becoming instead something that depends upon one’s state of motion. Some physicists and philosophers have even argued that relativity theories force us to give up on the idea that time passes at all. The well-known slogan is that, after Einstein, time became no more than a fourth dimension of reality alongside the three spatial dimensions, albeit a dimension that differs from the spatial dimensions in important ways. Finally, according to general relativity theory, even space was radically changed: instead of a homogeneous container in which the laws of Euclidean geometry hold everywhere, space became “curved”, variably bent and stretched by the gravitational effects of massive bodies. And not only space is warped in general relativity: time itself may become so curved by intense gravity fields that it loops back on itself. Time travel into the past becomes a physical possibility, as depicted in the 2014 movie Interstellar.

Locality

All these revisions to our everyday conceptions of space and time are fascinating and radical, and it is a shame we cannot explore them here. But I mention them because, as radical as they were, Einstein’s innovations did not change our ability to think of the world as consisting of three-dimensional objects, spread around in an immense container (space), and existing through time. Nor did relativity theories undermine our belief that spatially distant objects barely affect what happens in the here and now. On the contrary, they reinforced that belief. According to Einstein’s theories, spatially separated objects can only influence each other via the propagation of some material body or field between them, and that influence can never propagate faster than the speed of light (in sharp contrast with Newton’s theory of gravity as an instantaneous force whose strength diminishes with the square of the distance between the objects). This Einsteinian prohibition of influences that are instantaneous, or that propagate faster than the speed of light (c), came to be considered bedrock physics in the 20th century, and is often known as the principle of locality or Einstein locality (hereinafter, simply “locality”). Einstein himself thought that locality was so fundamental to the physical sciences that we could barely make progress in physics were we to give it up. [1]

Given that Einstein was one of the early founders of quantum mechanics, it is ironic that one of the most significant but least well-known ways in which quantum physics revolutionises our understanding of the physical world is precisely that it overthrows locality! Einstein saw this very early on, and it was the main reason for his opposition to the quantum theory developed in the 1920s. Einstein could not accept the “spooky action at a distance” that quantum theory seemed to postulate, and for many years he searched for an alternative theory, consistent with known quantum phenomena, that would preserve locality.

What Einstein did not realise – what nobody realised, until the seminal works of John Bell in the 1960s – is that it would turn out to be possible to definitively establish, in the laboratory, whether nature displays failures of locality. The 2022 Nobel Prize in Physics was awarded to experimental physicists who established, through a series of experiments carried out between the 1970s and early 2000s, that nature does violate locality, just as quantum theory predicts. Bell might well have been a co-recipient had his death in 1990 not rendered him ineligible: the prize is not awarded posthumously.

In order to understand all this better, we need to back up a bit and discuss how quantum mechanics predicts non-local phenomena.

Quantum non-locality

According to quantum mechanics, particles that have interacted in the past, or which share a common origin (for example, pairs of photons generated in a collision of heavier particles), have states that are entangled, i.e., the properties of one particle are related to – coordinated with – the properties of the other particle. Importantly, this connection remains a feature of both particles, no matter how far apart they may travel, until (perhaps) some interaction with a third system occurs to break their entanglement. The connection does not diminish with distance. [2]

For example, two photons may be emitted by an atom in a high-energy state, in such a way that one will be detected on one side of a laboratory, and the other on the opposite side. The spins of the photons (spin is a kind of intrinsic angular momentum possessed by particles) are represented in the quantum state as indeterminate: they possess no definite value prior to measurement, only probabilities for manifesting various values. But the values are inexorably linked: once one photon is detected and measured on one side of the laboratory, it becomes certain that, if the partner photon is measured in the same way, on the other side of the laboratory, a specific definite result will be obtained. In the language sometimes used to speak of quantum measurements, the measurement of one photon’s spin “collapses” the state of both photons into having definite values. In a sense, to measure one of the pair is to measure both of them – even if they are separated by miles, or millions of miles. Spin measurements are always made using devices that are oriented along a certain spatial axis, as shown in the figure. Entangled particles in a quantum “singlet” state will always give opposite results, if measured by apparatuses oriented in the same direction, regardless of the direction chosen. Later on we will see what happens when the apparatuses are not oriented in exactly the same direction.

Graphic representation of the Bell–EPR experiment



Thinking about these connections between measurement results, it is natural to suspect that all the measurements are actually doing is revealing properties that the quantum particles already possessed. These pre-existing properties are sometimes called “hidden variables”, because they are not found in the quantum mechanical description of the pair of particles. This idea, then, involves an assumption that the quantum mechanical description of the particles (which, you will recall, describes them as indeterminate with respect to such properties as position, momentum or spin) is incomplete. Hence quantum mechanics would be merely a useful theory for making statistical predictions, but not a theory that directly and completely describes microscopic reality. This is precisely what Einstein maintained. A famous paper co-authored by Einstein, Boris Podolsky and Nathan Rosen in 1935, now usually known as the EPR paper, was titled “Can quantum-mechanical description of physical reality be considered complete?”.

Bell’s insight

So we have two ways of thinking of entangled particles. According to quantum mechanics, they do not possess definite properties of, say, spin along a given axis, prior to measurement; but measurement of the spin of one particle instantly makes it the case that measurement of the partner particle will yield the opposite result, if the measurements are made by apparatuses pointing in the same direction. For measurements where, say, one particle’s spin is measured in the x-direction, while the partner particle’s is measured along an axis 𝛳 degrees away from x, quantum mechanics predicts how anti-correlated the results will be, as a function of 𝛳. According to Einstein’s way of thinking, from the moment of separation, each particle possesses definite properties that will determine the result of the measurement, whether measured along the same axis or along different axes.
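The quantum prediction mentioned here can be computed directly from the entangled state itself. The following sketch (a minimal illustration, not part of the original article; it uses two spin-1/2 particles in the singlet state, for which the expected product of the two ±1 outcomes works out to −cos 𝛳) builds the state with NumPy and evaluates the correlation for two axis angles:

```python
import numpy as np

# Pauli matrices: spin observables (outcomes +1/-1) along the z and x axes
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

# Singlet state of two spin-1/2 particles: (|up,down> - |down,up>)/sqrt(2)
up = np.array([1, 0], dtype=complex)
down = np.array([0, 1], dtype=complex)
singlet = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)

def spin_along(theta):
    """Spin observable along an axis tilted theta radians from z (x-z plane)."""
    return np.cos(theta) * sz + np.sin(theta) * sx

def correlation(theta_a, theta_b):
    """Quantum expectation of the product of the two outcomes."""
    obs = np.kron(spin_along(theta_a), spin_along(theta_b))
    return float(np.real(singlet.conj() @ obs @ singlet))

print(round(correlation(0.0, 0.0), 6))        # same axis: -1 (always opposite)
print(round(correlation(0.0, np.pi / 3), 6))  # 60 degrees apart: -cos(60) = -0.5
```

At 𝛳 = 0 the result is −1, the perfect anti-correlation described above; as 𝛳 grows the anti-correlation weakens, exactly as the text says quantum mechanics predicts.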

Bell’s brilliant insight in the early 1960s was that, if Einstein’s perspective were correct, then there were upper limits to how much correlation could be observed between measurements made at various angles 𝛳. In fact, with a simple but ingenious argument, Bell proved that, for a large number of spin measurements on opposite sides of a laboratory, made at one of three axis angles (we shall call them 1, 2 and 3, the angles to measure on a given run being determined by a randomising procedure such as flipping a coin), there is a mathematical limit to the sum of the correlations for measurements at angles (1, 2), (1, 3) and (2, 3). [3] This limit is imposed strictly by logic, on the assumption that the particles cannot directly influence each other’s measurement outcomes and that each measurement result is determined purely by whatever properties the particle already has when it enters the measurement apparatus. This assumption, of course, is precisely the principle of locality.
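Bell’s logical point can be checked by brute force. In the sketch below (an illustration of the reasoning under the hidden-variable picture just described, not a detail from the article), each emitted pair carries predetermined ±1 answers for the three possible axis settings, with the right-hand particle always answering opposite to the left-hand one; enumerating all eight possible assignments shows that Bell’s original bound can never be exceeded:

```python
from itertools import product

# Under locality, each pair carries predetermined outcomes a1, a2, a3 (each +1
# or -1) for the three axis settings; the right-hand particle always gives the
# opposite answer, reproducing perfect anti-correlation for equal settings.
# For settings (i, j), the product of the two outcomes is ai * (-aj) = -ai*aj.
for a1, a2, a3 in product([+1, -1], repeat=3):
    E12, E13, E23 = -a1 * a2, -a1 * a3, -a2 * a3
    # Bell's original inequality; since it holds for every single assignment,
    # it also holds for any statistical mixture of assignments.
    assert abs(E12 - E13) <= 1 + E23

print("Bell's bound holds for all 8 local hidden-variable assignments")
```

Because the inequality holds for every deterministic assignment of hidden values, it must also hold for the averages over many runs, no matter how the source distributes the hidden values — that is the “strictly by logic” step.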


What Bell proved, then, is that, if locality is true, there is an upper limit to the sums of correlations for measurements made by spatially separated apparatuses. The interesting thing is that quantum mechanics predicts sums of correlations that exceed Bell’s upper limit. How can this be? Well, as we have already said, quantum mechanics predicts that measuring one particle instantaneously makes it the case that a measurement on the other particle will either give the opposite result (if measured along the same axis); or will have a certain probability of giving the opposite result if measured at an angle differing by 𝛳 degrees, a probability that is different from the probability assigned before the first particle was measured. In other words, the predictions of quantum mechanics themselves violate locality, hence Bell’s limit does not apply.
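The violation is easy to see with concrete numbers. For spin-1/2 particles in the singlet state, quantum mechanics predicts a correlation of −cos 𝛳 between measurements at relative angle 𝛳 (for photon polarisation the formula involves 2𝛳 instead, but the logic is the same). Taking three coplanar axes at 0°, 60° and 120° (a standard illustrative choice, not a detail from the article):

```python
import math

# Quantum correlation for the singlet: E(theta) = -cos(theta), where theta is
# the relative angle between the two measurement axes (spin-1/2 case).
def E(theta_deg):
    return -math.cos(math.radians(theta_deg))

# Three coplanar axes at 0, 60 and 120 degrees.
E12 = E(60.0)    # axes 1 and 2: -0.5
E13 = E(120.0)   # axes 1 and 3: +0.5
E23 = E(60.0)    # axes 2 and 3: -0.5

lhs = abs(E12 - E13)   # 1.0
rhs = 1 + E23          # 0.5
print(f"|E12 - E13| = {lhs:.2f} > 1 + E23 = {rhs:.2f}: Bell's bound is violated")
```

No assignment of pre-existing local values can reproduce these three correlations at once, which is exactly the sense in which the quantum predictions themselves violate locality.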

Locality fails in the laboratory

What decades of experiments have shown is that the quantum mechanical predictions are borne out and Bell’s limit is exceeded by the measurement results nature gives us. But what are we really saying when we say that the measurement result on one particle instantaneously changes the probabilities of the possible results on the other particle? In the formalism of standard, non-relativistic quantum mechanics, it means exactly what it says: there is an instantaneous change in the second particle despite its spatial separation from the first (which, theoretically, can be as large as we like), much as Newtonian gravity was supposed to act instantaneously. But the effects of measuring one particle do not have to travel or propagate from the first particle to the second; nor do those effects diminish in any way as the distance between the particles increases, unlike the force of gravity. Experiments carried out in 1982 by Alain Aspect, one of the 2022 Nobel laureates, established that the effect is indeed instantaneous, or at least that it travels faster than the speed of light (if we assume, contrary to what quantum mechanics seems to say, that the effect must somehow travel from one side of the lab to the other). Aspect measured the photons on each side of his laboratory at precisely the same time, to within an accuracy of timing that ruled out any signal at light speed (or slower) being passed from one particle’s measurement to the other’s. [4]

This apparent violation of locality is, in a sense, shocking, and some physicists have tried to escape it by questioning Bell’s reasoning or by rejecting some tacit premise of Bell’s argument other than locality. Since we cannot go into all the logical possibilities here, the situation in brief is this: there remain only two ways to restore locality in the face of the experimental facts. The first is known as “superdeterminism” and involves postulating correlations between the hidden properties possessed by the particles before they are measured and the choices of measurement angle on each side of the lab. These choices are made by flipping a coin, using a random number generator, or some other such method; postulating correlations between whatever determines such choices and the spin properties possessed by the particles strikes most physicists and philosophers as too conspiratorial. The other way to preserve locality is to postulate that the measurement events themselves causally influence or determine the properties possessed by the particles from the moment of their emission. In other words, if we postulate causation that goes backward in time, we can (technically) preserve the principle that all causal influences “travel” at speeds no faster than c. As with superdeterminism, backward-in-time causation is usually considered much less plausible than simply accepting that nature displays non-locality.

Farewell to space?

You will recall that the connection between entangled quantum particles is completely independent of their apparent spatial separation. The particles behave, in fact, as though to interact with one of them is, ipso facto, to interact with both of them. This is also suggested by a natural, straightforward reading of the way the two particles’ state is represented in the quantum formalism. The “space” of that formalism is a many-dimensional mathematical space (a “Hilbert space” or “configuration space”) that has one dimension for every basic degree of freedom of each particle, in a set of compatible or “commuting” degrees of freedom. If we represent a massive particle with spin, we need a 4-dimensional space: three dimensions for the spatial degrees of freedom (location in the x-, y- and z-directions), and one more for its spin angular momentum state. If we represent a system of two massive particles with spin, we need 8 dimensions; and so on, as we add further particles. Entangled states then pick out special sub-regions of this high-dimensional space as those that are physically allowed, sub-regions from which measurement results then “select” an even more restricted sub-region. The key point is that the measurement process “touches” both particles at once, by selecting some region of the 8-dimensional space as the region representing the properties of the combined system.
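A toy calculation may make the “touching both at once” concrete. The sketch below (an illustration only; to keep it short it uses just the two spin degrees of freedom, a 2 × 2 = 4-dimensional space, rather than the full 8 dimensions mentioned above) builds the singlet state and applies the projector for finding particle 1 “up”; the resulting joint state leaves particle 2 certainly “down”, even though the projector acts as the identity on particle 2:

```python
import numpy as np

# Two-spin Hilbert space: each particle contributes one 2-dimensional spin
# degree of freedom; the joint space has 2 x 2 = 4 dimensions.
up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
singlet = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)

# Projector for finding particle 1 "up" along z; identity on particle 2.
P_up_1 = np.kron(np.outer(up, up), np.eye(2))

post = P_up_1 @ singlet
post = post / np.linalg.norm(post)      # "collapsed" joint state

# Particle 2 is now certainly "down": up(1), up(2) has probability zero.
prob_up_up = abs(np.kron(up, up) @ post) ** 2
prob_up_down = abs(np.kron(up, down) @ post) ** 2
print(round(prob_up_up, 6), round(prob_up_down, 6))   # 0.0 1.0
```

The projection selects a sub-region of the joint space, and in doing so fixes the properties of both particles at once — which is the point made in the paragraph above.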

The point is not that quantum theory invites us to believe that the true space of our cosmos is not our familiar 3D space but rather some immensely high-dimensional “configuration space” (though some philosophers have defended that claim). Rather, the point is that quantum theory invites us to consider the possibility that the 3D space of common sense is not as fundamental as we have traditionally believed it to be. Indeed, it invites us to consider the possibility that the 3D space of common sense may be only what philosophers call a mere appearance: a useful tool of our perceptual systems and our early scientific endeavours, but not a part of the fundamental “furniture of the world” at all.


This may seem a radical conclusion to draw from experiments involving specially prepared particles, but in fact entanglement is a ubiquitous phenomenon in quantum theories, though its effects are not directly visible to us in daily life. Most interactions of particles and fields produce entanglement: when an atom emits a photon, for example, the momenta of the atom and the photon become entangled. And it is worth remembering that, shortly after the Big Bang, all the matter-energy we see around us was compressed together in a tiny spatial region, in which everything was essentially interacting with everything else. From a god’s-eye point of view, then, almost everything may be non-locally connected with almost everything else! This notion is radically at odds with our picture of objects as separate, independent islands in a vast container that keeps things mostly well separated from each other. Little wonder, then, that many physicists have tried to find a way to avoid the conclusion of Bell’s argument.

Beyond ordinary quantum mechanics: quantum gravity and space

The entanglement-based violations of locality we have looked at are not the only hints from modern physics that 3D space, or 4D space-time, may not be a fundamental feature of reality.

Since the 1930s, physicists have been working to bring together general relativity theory and quantum mechanics. Unlike the other basic “forces” in nature – the weak, strong and electromagnetic forces, all of which are successfully described by quantum theory – the fourth force, gravity, has resisted integration into the quantum world. After decades of attempts to quantise gravity using standard techniques failed to yield the desired results, physicists began exploring a variety of different approaches, some with radically different foundations from those of either general relativity or quantum theory. Although no proposed quantum gravity theory currently commands widespread assent, several postulate fundamental entities that are either entirely non-spatial (e.g., causal set theory) or entirely non-spatiotemporal (e.g., some variants of loop quantum gravity). In some of these approaches, the idea is that space, or spacetime, becomes an emergent entity (or perhaps an emergent property of the fundamental state of our universe). According to such theories, some physically possible worlds could entirely lack space or spacetime as we know them.

It is far too early to extract philosophical lessons from any quantum gravity theory, but thinking about the possibilities is fascinating, for philosophers as well as physicists. Only in the future – perhaps a quite distant future – will humankind know what the cosmos we live in is really like, fundamentally speaking. And judging from the past 100 years, it seems likely that the quantum revolution’s challenges to our common-sense view of the cosmos are far from over!

References and footnotes

1 — Einstein thought that, if arbitrarily distant events could influence events in the here and now, with a strength of impact not diminished by distance, then it would never be possible to rule out any hypothesis dreamed up to explain what happens here and now partly in terms of things happening on the other side of the globe, or even in distant galaxies. He therefore believed that the principle of locality was needed in order for experimenters to be able to isolate and control the factors that determine the outcomes of their experiments.

2 — Entanglement has become an important tool and resource for the burgeoning fields of quantum computing and quantum cryptography, as described in other articles in this issue.

3 — For example, (1,2) means that the particle arriving on the left side of the lab is measured along axis 1, and the particle on the right side of the lab is measured along axis 2.

4 — Here and above, when I say “instantaneously” or “at the same time”, I mean “according to the reference frame attached to the laboratory”. Relativity theory tells us that observers passing by in fast-moving spacecraft would disagree as to which particle was measured first – this being the famous “relativity of simultaneity” in Einstein’s theories. But any and all observers, no matter how fast they are moving, would agree that any influence propagating from the first measurement to the second would need to travel faster than c.

Carl Hoefer

Carl Hoefer is an ICREA research professor at the University of Barcelona and holds a PhD in Philosophy from Stanford University. He is currently director of the Barcelona Institute of Analytical Philosophy (BIAP). His research focuses on ancient metaphysical questions as they arise from the metaphysics of nature implied by scientific theories: in particular, the nature of space, time and motion, especially in connection with Einstein's theories of relativity, and the nature of objective probability. Between 2009 and 2017 he was founding editor of the European Journal for Philosophy of Science, published by Springer. He has worked at the University of California, the London School of Economics and the Research Group in Epistemology and Cognitive Sciences (GRECC) at the UAB.