The meaning of history
It is no exaggeration to say that history is constantly being rewritten as the years go by. This happens mostly because of new findings or analyses that complement or overturn current narratives, but also – and this is more interesting – because of transformations in contemporary outlooks. History is thus changed by later events, on the one hand, and by the way it is seen, on the other. Both draw our attention toward past events that would otherwise have seemed trivial or would even have remained invisible.
In the history of science, this process is known, studied and also feared. A presentist vision is one that judges past events solely on the basis of present values. To avoid it, one must begin any study by understanding the cases in their own context, setting aside their subsequent vicissitudes. Obviously, the later success of a technique or the generalised confirmation of a theory often changes the perception of its origins, particularly with respect to the other techniques and theories with which it competed. As we know, history is written by the victors.
In the more specific domain of the history of quantum theory, the expansion of quantum computing, and all the science distinguished with the 2022 Nobel Prize in Physics, bear witness to one of these transformations. Episodes that until a few years ago were not an essential part of the canonical history of the discipline now feature very prominently.
In an interview available on the Nobel Foundation website, Professor Thors Hans Hansson, member of the Royal Swedish Academy of Sciences, explains – referring to the decision to award the prize in 2022 – how quantum computing shows “what quantum mechanics really means”. [1] It is a sentence we should pay attention to. Imre Lakatos argued that the main difference between how he understood scientific progress and how Karl Popper understood it was that, for the latter, we will never know how a theory should be modified until it has been refuted; for Lakatos, on the other hand, we will never know how a theory has been refuted until it has been replaced. [2] Seen in this light, in the case concerning us, the latest developments show us more precisely where the classical theory failed.
The second quantum revolution (1970-2022)
In the early years of the 21st century, people began to talk about a second quantum revolution, referring to the technological progress that has enabled us to glimpse the possibilities of quantum computing and simulation. The first revolution was the conceptual one, which took place a century earlier. We will look at the conceptual implications that the second revolution has for the first.
John Clauser, Alain Aspect and Anton Zeilinger were awarded the prize for their ingenuity and experimental prowess, which succeeded in eliminating the famous loopholes in the set-ups that seek empirical evidence of entanglement. Little by little, the gaps that could challenge confirmation of this property, which today symbolises the essence of the quantum realm, have been closed. These experiments found that a measurement on one subsystem instantly affects the state of the other subsystem with which it was entangled before the measurement. From the first rudimentary experiments performed with photons from atomic de-excitation cascades in the late 1960s, to the current set-ups that use different kinds of photons and even massive particles, entanglement has been the key that everyone wanted to detect and illuminate: for the enormous interest it holds in areas such as encryption, but above all because it is a distinctive, fascinating property that the objects of nature possess at the atomic scale, one which raises the possibility of instant links between distant objects.
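To give a concrete picture of what these set-ups test (the state written here is a standard textbook example, not a detail taken from the awarded experiments), the photon pairs are typically prepared in a joint polarization state such as

\[
|\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|H\rangle_{1}|H\rangle_{2} + |V\rangle_{1}|V\rangle_{2}\bigr),
\]

where H and V stand for horizontal and vertical polarization. Neither photon has a polarization of its own before measurement, and yet the results obtained on the two sides are perfectly correlated.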
When did this property become so important in modern physics? [3] Let’s go back further.
After the Second World War (1939-1945), research in physics took off again. The number of physics students increased considerably, especially in the United States (the European countries would take a few more years to recover). Obviously, there was a reason for this boost, which was aimed particularly at accelerating the development of technical applications. Compared with the generation of the first third of the 20th century, the new generation’s grounding in humanities-related disciplines, such as philosophy, declined considerably relative to its scientific training.
The new outlook and the number of students in the classrooms led to substantial changes in teaching content and methodologies: greater weight was given to calculation and less to conceptual matters. This new panorama is often summed up in the formula “shut up and calculate”: debate on fundamentals had to give way to an increased focus on calculation. In the eyes of those in power, the technological rivalry with the communist bloc made this approach all the more legitimate and desirable.
David Kaiser has convincingly argued that the protest movements of the late 1960s were reflected, in the field of physics, in a group of young people who, in the early 1970s, refused to accept this trend and reclaimed and revived the debate on the interpretation of quantum mechanics. [4] In more than one Californian university, small groups of physicists formed to question the so-called Copenhagen interpretation (a term for quantum orthodoxy that had been coined only a couple of decades earlier, precisely in criticism of alternative theories), and they tried to establish links with the teachings of eastern religions and with psychic phenomena such as telepathy. They approached the discussions on the meaning of quantum theory by drawing particularly on Bell’s theorem.
In 1964, John S. Bell had published a theorem that made it possible to distinguish experimentally whether a system perfectly described by quantum mechanics could also be encompassed within a local, causal description. According to quantum theory, the parts of an entangled system do not have definite parameters until a measurement is made; and when it is made, the parameters are determined immediately for all the subsystems. Bell found a way to assess in the laboratory whether these parameters were fixed before the measurement (even though the experimenters did not know their values) or whether, as quantum mechanics prescribes, they were only determined at the precise moment they were measured. A theoretical physicist by profession, Bell had to prepare and discuss this contribution outside working hours: at that time, many physicists (and journal editors) considered that these issues did not properly belong to the realm of physics. Entanglement was far from representing “what quantum mechanics really means”.
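For the curious reader, the version of the inequality most often tested in the laboratory is the CHSH form, derived from Bell’s 1964 result by Clauser, Horne, Shimony and Holt (we quote it here as a standard illustration, not as it appears in Bell’s paper):

\[
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2,
\]

where E(a,b) is the correlation between outcomes measured along directions a and b. Any theory in which the outcomes are fixed in advance by local parameters must satisfy the bound, whereas quantum mechanics predicts values of S up to \(2\sqrt{2}\) for suitably chosen directions. It is this violation that the awarded experiments repeatedly observed.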
The Cold War and hidden causality (1945-1964)
A victim of the Cold War, the American David Bohm had already experienced such belittlement at the beginning of his career, along with forced exile from the United States because of his communist views. And yet it was his contributions that piqued Bell’s curiosity. Bohm had completed a doctoral thesis that contained classified material because of its connection to the Manhattan Project. In his case, politics and physics went hand in hand: dialectical materialism underpinned the realism and determinism of his ideas. In other words, poles apart from the hippies. When Bohm abandoned communism, he also abandoned the causal explanation, and he even made proposals that took into account the role of consciousness in the new physics, spurred by his relationship with the Indian thinker Jiddu Krishnamurti (“the observer is the observed”), at last in line with the young scientists who took up his reflections in the early 1970s. [5]
Bohm had not accepted the orthodox interpretation of quantum mechanics uncritically and, in 1952, he formulated a causal theory that reproduced the known results of quantum mechanics. At that time, the most controversial feature of quantum mechanics was the randomness of elementary processes, that is, the supposed impossibility of developing a completely deterministic atomic theory. It must be said that neither Albert Einstein nor Erwin Schrödinger, two illustrious dissidents, showed much interest in this proposal. However, there is no doubt that Bohm’s work contributed substantially to putting problems such as measurement, or the possibility of designing a deterministic theory, at the centre of debate, contrary to received opinion and to what a theorem by John von Neumann had prematurely ruled out in 1932. Bohm received more recognition at the end of his life (he died in 1992), when the topics he had touched upon progressively became mainstream as a result of social, political and technological transformations.
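In its modern presentation (a schematic sketch, not Bohm’s original notation), the theory keeps the wave function ψ of ordinary quantum mechanics but adds particles with definite positions at all times, guided by the wave according to

\[
\frac{d\mathbf{x}_k}{dt} = \frac{\hbar}{m_k}\,\operatorname{Im}\!\left(\frac{\nabla_k \psi}{\psi}\right),
\]

so that the evolution is fully deterministic and randomness enters only through our ignorance of the initial positions.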
This episode beautifully illustrates the fruitfulness of controversy and debate in science. Even though Bohm did not succeed in constructing an alternative theory, his contribution was crucial for putting the possibility of developing one at the centre of debate, for asking what measuring means, and for taking non-locality seriously. The caloric theory of the 18th century and the ether of the 19th century are ideas that played a significant role in the emergence of new theories (thermodynamics and relativity) and should not be erased from the history of physics or passed off as anecdotal episodes. Many of their features have left their mark on the final form of modern-day theories. The same applies to Bohm’s ideas and to the place of Bell’s inequalities today.
The uncertainty of reality and the first dissidents (1925-1945)
Continuing our journey backwards in time, we now come up against an obstacle we have already mentioned: the Second World War, difficult times for dissidents. In the years immediately before it, quantum research was confined almost exclusively to nuclear physics, and debate on fundamentals was virtually non-existent. Not long before that, however, the first symptoms of disenchantment with the hegemonic interpretation established after the foundational discussions of the 1920s had already emerged. It was then that the concept of entanglement, absent from the early debates, burst onto the scene, although it did not yet hold centre stage.
Indeed, after a more or less general consensus was reached in the autumn of 1927 (when the matter was symbolically closed at the fifth Solvay conference), critical voices began to emerge in the early years of the following decade. Without doubt, the most important papers in this respect are those by Albert Einstein, Boris Podolsky and Nathan Rosen, and by Erwin Schrödinger, both published in 1935. The two contributions are related, because the paper by Einstein, Podolsky and Rosen encouraged Schrödinger to publish his critical analysis of the state of quantum theory. He called it a “confession”.
Einstein, Podolsky and Rosen had devised a thought experiment that made it possible to obtain the magnitudes of one subsystem by measuring another: an ingenious way of granting reality to properties supposedly excluded from the theory by the uncertainty principle. They concluded that quantum mechanics was evidently not a complete theory. For the authors, the murkiest aspect of the dominant interpretation was the elimination of variables that were classically indispensable for completely defining a state, together with the fact that this elimination depended on the type of experiment performed.
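In the simplified version that Bohm would later give of the argument (the notation here is modern and merely illustrative; the original paper worked with positions and momenta), two particles are prepared in the singlet spin state

\[
|\psi\rangle = \tfrac{1}{\sqrt{2}}\bigl(|\!\uparrow\rangle_{1}|\!\downarrow\rangle_{2} - |\!\downarrow\rangle_{1}|\!\uparrow\rangle_{2}\bigr).
\]

Measuring the spin of particle 1 along any axis tells us with certainty, and without any disturbance, the result of the same measurement on the distant particle 2; by the authors’ criterion, that value should then correspond to an element of reality that the theory does not contain.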
A few months later, Niels Bohr replied that, in actual fact, the variables chosen by the authors did not escape the limitations of the uncertainty principle, dismantling their argument. However, he acknowledged and developed the point made, rethinking the meaning of the uncertainty principle itself, the emblematic representative of the new mechanics, which until then had been understood as a physical consequence of the disturbance caused by measurement. [6] After this debate, the entanglement between subsystems made it necessary to qualify the concept of disturbance, because a measurement could instantly affect distant systems.
It was in Schrödinger’s discussion (published before Bohr’s answer) that the term entanglement (Verschränkung) was coined with its new meaning. However, Schrödinger also wanted to focus on the superposition of states, inventing the famous cat paradox: if the superposition of possible states was itself a possible state, glaring contradictions would follow, at least in the domain of everyday objects. In addition, in another article published the same year, he took the effects of entanglement further and proposed a way of changing a subsystem’s state using measurements on another subsystem spatially separated from the first; in the context of quantum information and computing, this is known today as quantum steering.
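Schematically (the notation is ours, not Schrödinger’s), the paradox arises because the formalism admits states of the form

\[
|\Psi\rangle = \tfrac{1}{\sqrt{2}}\bigl(|\text{nucleus intact}\rangle\,|\text{cat alive}\rangle + |\text{nucleus decayed}\rangle\,|\text{cat dead}\rangle\bigr),
\]

in which the fate of the cat is entangled with an atomic event and neither subsystem has a definite state of its own.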
So, in 1935, the concept of entanglement was already defined, ten years after quantum mechanics was first formulated (Werner Heisenberg, 1925). It was a striking consequence of the theory, though not yet its most significant implication; but it was a sufficiently obscure point for Einstein and Schrödinger to devote attention to it. Einstein once referred to it as “spooky action at a distance”.
Humble beginnings: the quantum leaps (1900-1925)
So now we come to the end of our story, the origin of quantum theory. What new properties of matter and radiation drew the attention of the first physicists who studied it?
There are arguments for attributing the honour of discovery to Max Planck in 1900, but also to Einstein in 1905. [7] Basically, the differing opinions revolve around two aspects: on the one hand, the awareness of the significance of the step being taken (the quantization of energy); and, on the other, the radicalness of the quantization proposed. Einstein argued that the energy exchange between light and matter takes place in packets, and extended this to the constitution of light itself. Planck downplayed the importance of quantization, and even tried for years to reconcile it with classical physics.
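In formulas (standard textbook notation, added here by way of illustration), Planck’s hypothesis for the quantum of energy exchanged with radiation, and Einstein’s equation for the photoelectric effect, read

\[
\varepsilon = h\nu, \qquad E_{\text{kin}} = h\nu - W,
\]

where h is Planck’s constant, ν the frequency of the light and W the work needed to extract an electron from the metal: the energy of the extracted electron depends on the frequency of the light, not on its intensity, as the classical wave picture would suggest.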
So Einstein was the first to point out the strangeness and significance of the hypothesis that gave the theory its name. And his version already contained a first feature in which traces of non-locality might be suspected: if light is transmitted as a wave but exchanges energy in packets, there must be some kind of instant concentration, a non-local space-time process that eludes the explanations of electromagnetism. At the time this went unnoticed because, in general, Einstein’s hypothesis was sidelined: nobody wanted to replace the wave theory of light with a theory that explained little more than the photoelectric effect. The concept of duality was still several years away; in Einstein’s case, only a few. In 1909, puzzled by these contradictory manifestations of light and with almost no support, he suggested for the first time that the future solution to the enigma would come from a compromise between wave theory and particle theory. [8]
The next major milestone was Bohr’s atomic model, in 1913. Using postulates, the Danish physicist overcame two giant problems faced by the builders of atomic models: hyperstability (atoms are collision-proof structures) and the origin of spectra. He postulated the existence of stationary states in which the electrons did not radiate, in spite of being accelerated; in his model, the radiation emitted by atoms depended solely on the energy difference between the stationary states involved, and had nothing to do with the frequency of the electrons’ orbital motions, as Maxwell’s equations envisaged. Beyond this axiomatization contrary to the physics of the day, some of Bohr’s colleagues objected to the loss of causality: if an electron emits at the frequency corresponding to this energy difference, it must do so during a process that has not yet finished, and whose final state determines the frequency of the radiation emitted beforehand. This cast a new shadow on causality (in this case, a reversal of the cause-effect relationship) but, at the time, it failed to ignite any controversy.
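In modern notation (ours, not Bohr’s), the postulate states that the frequency ν of the radiation emitted in a transition depends only on the energies of the two stationary states involved:

\[
h\nu = E_{n} - E_{m},
\]

with no reference whatsoever to the orbital frequency of the electron, in open conflict with classical electrodynamics.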
To end our journey, it is worth jumping forward to the early 1920s, just before the birth of quantum mechanics, when Louis de Broglie introduced wave-particle duality and Einstein incorporated it into his treatment of the ideal quantum gas in 1924. It enabled him to explain a mysterious property of gas particles in the new theory: they behaved as a whole; the particles’ individuality had vanished to the point that it no longer made sense to locate them or imagine them separately. This is what many years later would be known as quantum indistinguishability. The whole was defined; the parts were not. Einstein himself wondered about the type of interaction or connection that kept all the particles in the gas functioning collectively. Of course, for the father of relativity, an interaction, no matter how mysterious and unknown, could not be instantaneous, so he used de Broglie’s hypothesis to account for it.
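In the formalism that would be developed shortly afterwards (our illustration, not Einstein’s 1924 calculation), this loss of individuality is expressed by wave functions that are symmetric under the exchange of particles, such as

\[
\psi(x_1, x_2) = \tfrac{1}{\sqrt{2}}\bigl(\phi_a(x_1)\,\phi_b(x_2) + \phi_b(x_1)\,\phi_a(x_2)\bigr),
\]

a state of the pair in which it no longer makes sense to ask which particle occupies which single-particle state.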
Entanglement had emerged again, this time with greater force, but it was still not understood as the marrow of the quantum bone. Particles became indistinguishable, which was a way of including in the old particle picture a kind of field that kept them together and interconnected.
It was all a little confusing but, with time, it would become clearer and more defined. Quantization, discretization, uncertainty, superposition, randomness… and entanglement. It remains to be seen what forthcoming studies and developments will bring us.
References and footnotes
1 — See the entire interview on the Nobel Foundation website. Available online.
2 — Lakatos, I. (1974). “The role of crucial experiments in science”. Studies in History and Philosophy of Science, 4: 309-325.
3 — For a detailed account of these happenings (until 2005), see Gilder, L. (2008). The age of entanglement: When quantum physics was reborn. New York: Alfred A. Knopf.
4 — Kaiser, D. (2011). How the hippies saved physics: Science, counterculture and the quantum revival. New York: W. W. Norton & Company.
5 — Freire, O. (2015). The quantum dissidents: Rebuilding the foundations of quantum mechanics (1950-1990). Heidelberg: Springer.
6 — Beller, M. (1999). Quantum dialogue: The making of a revolution. Chicago: University of Chicago Press.
7 — Duncan, A.; Janssen, M. (2019). Constructing quantum mechanics. New York: Oxford University Press.
8 — Navarro, L. (2020). El desconocido Albert Einstein. Barcelona: Tusquets.
Enric Pérez Canals
Enric Pérez Canals is associate professor at the University of Barcelona's Department of Condensed Matter Physics. His research is focused on the history of modern physics, specifically, on the interrelations between quantum theory and statistical physics. He has been teaching History of Physics at the Faculty of Physics since 2010, and is co-author of the book Física estadística. De estados y partículas: una mirada nueva a viejas controversias (Statistical physics. Of states and particles: a new look at old controversies) (Edicions UB, 2018) with Pere Seglar i Comas. He has also written and directed short plays based on episodes and concepts in modern physics.