“That is positively the dopiest idea I ever heard.” That is what the Nobel Physics Laureate Richard Feynman said to Daniel Hillis when the latter told him that he wanted to start a company to build a computer with a million processors operating in parallel. Driven by his fascination for working on the “dopiest” ideas proposed to him, Feynman spent the summer of 1983 with Hillis’ team building the prototype of the Connection Machine, one of the first massively parallel supercomputers. Not all companies have the good fortune to have a Physics Laureate designing their algorithms and processors. The fact is that Feynman had no difficulty in learning from scratch and specialising in a field, parallel computing, that had yet to get off the drawing board.
The first algorithm run on the Connection Machine was one for calculating logarithms, developed by Feynman himself 40 years earlier during the Manhattan Project. Immediately afterwards, Feynman used this first supercomputer to perform particle physics calculations far more efficiently than on the computers available at the time. Clearly, our physicist was keenly interested in computing machines. Deciphering the mechanisms of nature is becoming increasingly expensive in terms of mathematical effort, and the closer we look at its tiniest details, the more computing power is required. In fact, one of the stumbling blocks of traditional or so-called “classical” computing appears when it tries to simulate the physics of the smallest cogs of nature: quantum physics. This drawback was becoming increasingly apparent to the scientific community by the 1980s, and to Feynman in particular. Just a couple of years before his “summer stay”, he had given a seminal lecture on the limits of classical computing and the need for quantum computing.
The first quantum revolution
Quantum physics first saw the light of day over a century ago. This theory was able to describe phenomena and experiments that what was then known as classical physics could not explain. The consequences that emerged from it were surprising and, in many cases, counterintuitive, but they also enabled us to understand phenomena such as our Sun’s nuclear reactions or the properties of chemical elements and their reactions. This theory is the most precise of them all and, incredible as its predictions may seem to us, it is the theory that has been most thoroughly tested and validated in the last 100 years.
As soon as we humans discover how natural phenomena work, our instinct drives us to create tools that exploit them. Quantum physics was no exception. It was not long before the first applications and devices appeared that were designed thanks to the new understanding of quantum mechanics: the laser, solar panels, GPS, magnetic resonance, the transistor… All of them were invented in the mid-20th century, in a period that is known as the first quantum revolution. These inventions, and many others, arose from the understanding of collective quantum phenomena.
Other technological revolutions were also taking place during the same period. Specifically, thanks to an understanding of the physics of semiconductors, the transistor (1947) and, subsequently, its successors, the microprocessors, appeared. With these devices, humans became capable of performing increasingly complex automated calculations, with greater precision and without mistakes. Little by little, we learned increasingly sophisticated ways to process information coded in its smallest units, bits: the famous 0 and 1 on which all classical computing is based.
Quantum physics first saw the light of day over a century ago, describing phenomena that classical physics was unable to explain. Incredible as its predictions may seem, it is the theory that has been most thoroughly tested and validated in the last 100 years
Some physicists also began to wonder what would happen if the information was coded in bits with quantum properties, called “qubits”. It was around this time that the field of quantum information emerged, although, compared with its sister, classical information, it developed almost entirely in the theoretical realm, as there were no devices that were capable of containing and controlling qubits.
Meanwhile, traditional computing was advancing at incredible speed and calculations intensified as computers became more and more powerful. By nature, we humans also tend to push our inventions to the limit and, in the case of classical computers, one of these limits lies in the quantum realm.
So that brings us to 1981, when Feynman got down to work. In one of his lectures, he highlighted the importance of building computers whose computing power grows to match the size of the systems we wish to simulate with them. This means, for example, that if we want to use a computer to add two numbers, its resources (memory or number of operations per second) must be large enough to cope with the size of the numbers we want to add. If we want to multiply them, the resources must grow roughly as the square of that size. But if the intention is to simulate quantum systems, the necessary resources grow exponentially. With this type of growth, standard computers can be used without any problem for small quantum systems. As the systems become larger, however, eventually not even a supercomputer has enough resources to store so much information (let alone operate on it). The reason is that classical computers encode information in bits, and encoding a general quantum state in a bit string requires an exponential number of them. Feynman’s observation was that if we used quantum bits instead of classical bits, this exponential growth of computing resources would disappear. In other words, we must use quantum computers if we want to simulate and study quantum systems.
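The exponential growth described above can be made concrete with a minimal sketch (the function name and byte count per amplitude are illustrative assumptions, not anything from the original lecture): a general state of n qubits requires 2^n complex amplitudes, so the classical memory needed doubles with every extra qubit.

```python
# Illustrative sketch: memory needed to store the full state of an
# n-qubit quantum system on a classical computer.
# A general n-qubit state has 2**n complex amplitudes; here we assume
# 16 bytes per amplitude (two 64-bit floats).

def state_vector_bytes(n_qubits: int) -> int:
    """Bytes needed to hold all 2**n complex amplitudes."""
    return (2 ** n_qubits) * 16

for n in (10, 30, 50):
    print(f"{n} qubits -> {state_vector_bytes(n):,} bytes")
```

At 30 qubits this is already 16 GiB; at 50 qubits it is 16 PiB, beyond any supercomputer’s memory, which is exactly the wall Feynman pointed at.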
Does this mean that classical computers are useless for studying quantum physics? The answer is no. On one hand, the exponential growth of resources that we are trying to avoid only happens in the worst-case scenario: not all quantum phenomena or systems have such large computing needs. On the other hand, we can (and do) make approximations in our calculations that enable us to obtain good results. We have been using supercomputers for many years to study quantum systems in chemistry, materials science or particle physics, and this has enabled us to progress in a multitude of fields and applications. All of this gives us more reasons to pursue the invention of quantum computers that open the door to improving our understanding of the physics of the microscopic world and, consequently, to creating more applications.
Computing in the age of the second quantum revolution
Apart from Feynman, other physicists of the time had been working on the possibilities of quantum computing. Almost simultaneously, Yuri Manin reached the same conclusion as Feynman. Paul Benioff analysed the mathematical model on which quantum computing could be based: the quantum Turing machine. During the 1980s, many physicists and mathematicians started to propose quantum algorithms and study their computational complexity. This work showed that certain computations can be accelerated substantially by the use of qubits.
The field was thrown into turmoil when the physicist Peter Shor proposed an algorithm able to efficiently factor large numbers with a quantum computer. It would not have gone beyond being another mathematical curiosity were it not for the fact that much of the public-key cryptography we use today is based precisely on the fact that factoring large numbers is computationally hard. Shor’s algorithm enables an ideal quantum computer to break this cryptography and poses a great risk to present-day cybersecurity. This discovery put quantum computing under the spotlight in industry and governments by showing that a quantum computer could be used for much more than simulating complex physical systems.
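To see why factoring underpins this cryptography, here is a hypothetical sketch of the naive classical approach (this is not Shor’s algorithm, only the brute-force baseline it beats): trying divisors one by one takes on the order of the square root of the number, which grows exponentially with its number of digits.

```python
# Illustrative sketch: brute-force factoring by trial division.
# The work grows like sqrt(n), i.e. exponentially in the number of
# digits of n -- this hardness is what factoring-based cryptography
# relies on, and what Shor's quantum algorithm circumvents.

def trial_division(n: int) -> int:
    """Return the smallest non-trivial factor of n, or n itself if prime."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n

print(trial_division(15))    # -> 3
print(trial_division(2021))  # -> 43  (2021 = 43 * 47)
```

A real cryptographic modulus has hundreds of digits, so this loop would run for longer than the age of the universe; an ideal quantum computer running Shor’s algorithm would factor it in polynomial time.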
The new millennium arrived and, with it, the first experimental quantum logic gates. Cirac, Zoller, Mølmer and Sørensen, among others, developed the theory that was applied experimentally almost immediately and made quantum computing a reality. With the logic gates came the qubits, and more and more proposals for how to build quantum chips. Unlike traditional, silicon-based computing, qubits can be made with many different technologies: trapped ions, photons, superconducting circuits… All of them continue to be built and improved in parallel, as they have very different features.
Shor’s algorithm enables an ideal quantum computer to break the public-key cryptography in use today and poses a great risk to present-day cybersecurity
In short, the technology is already far enough advanced to make quantum computing a reality. And with that, companies and governments are starting to seriously invest in this second quantum revolution as, unlike the first revolution, it is no longer based on collective quantum phenomena: now we are able to control individual quantum systems. This technological revolution encompasses communications, sensors and, of course, computing.
Science, technology, sovereignty
We are living historic, never-to-be-repeated times. Quantum computer prototypes already exist and every year, we cross a technological milestone. Universities around the world have groups that are designing, building and improving quantum computers using different technologies. The companies and start-ups that are building these devices are starting to offer their services. Public and private investment is growing constantly and more and more potential applications are appearing.
However, not everything is as ideal as we would like it to be. Yes, we have quantum computers, but they are still small and imperfect. To implement the most powerful quantum algorithms that we know, we need millions of virtually perfect qubits, that is, qubits whose errors can be corrected automatically during the computation. Unfortunately, quantum technology is still not advanced enough for this to be possible. Present-day quantum computers are made up of a collection of “noisy” qubits (without error correction). We find ourselves in what is known as the era of noisy intermediate-scale quantum (NISQ) computing. In spite of this, we continue to make progress and the technology improves year after year, as do the algorithms and the applications. Many people are working to get the most out of today’s quantum computers and to prepare us for the quantum computers of the future.
At this point in time, we have two paths before us: stop doing things and wait for the technology to improve and for someone in the future to offer it to us, or take the initiative and develop this quantum computer of the future ourselves.
Europe is following the second path. Since 2018, we have had initiatives such as the Quantum Flagship, which entails a billion euros invested in quantum technologies distributed across five main areas: communication, sensors, simulation, computing and basic science. At present, the Quantum Flagship programme has entered its second phase (technology transfer), which seeks to build prototypes for the quantum applications studied during the first phase of the project. Thanks to this project, many start-ups specialised in manufacturing quantum computers and their components are being formed. Europe has also included quantum computing in the recent European Chips Act. The message is very clear: Europe does not want to depend in the future on foreign classical and quantum chip technology, we want to be suppliers, and there is a clear commitment to supporting European industry and science.
At this point in time, when we have universities and research centres studying and developing the basic technology and when the necessary industrial fabric is being created to exploit this technology and build the quantum computers, where are the users in all this? After all, someone will have to use this computing to discover its applications.
Quantum computing has achieved a sufficient level of development to come out of the laboratories and be offered to interested users. For almost 10 years, companies such as IBM, Google and Alibaba and start-ups such as Rigetti Computing, D-Wave Systems and IonQ have been offering remote access to their quantum computers. Cloud providers such as Amazon Web Services are aggregating many of these quantum computers and offering a single access environment to these machines. Although access to some small devices may be free, the fact is that the charges for accessing the more advanced quantum computers are increasing significantly, to the point of becoming prohibitive for most potential users. In a technology such as quantum computing, which promises to be highly disruptive, it is fundamental to guarantee access for the researchers and small businesses that want to study its possible applications. There is also a political side to this situation: who will have the knowledge and the industry required to build and use quantum computing? Will it be held solely in private hands? Which countries and regions will have these infrastructures?
This is where supercomputing centres are taking a step forward, offering their expertise in maintaining and providing supercomputing services to do the same for quantum computing. In the same way that the Quantum Flagship project is funding the scientific and technological development of quantum technologies (among them, computing), EuroHPC – the branch of the European Union that coordinates high-performance computing projects – has started funding projects to purchase, install and operate quantum computers in supercomputing environments. The aim of the EuroHPC initiative is twofold: to guarantee public access to quantum computers, just as access to classical supercomputers has been guaranteed for years, and to support Europe’s technological fabric by buying technology developed in the European Union.
Technology is already far enough advanced to make quantum computing a reality. Companies and governments are starting to seriously invest in this second quantum revolution. However, quantum computers are still small and imperfect
And we mustn’t forget the scientific dimension: all quantum algorithms need a classical computing component. To complete any quantum algorithm, a large part of the problem must be prepared and processed using traditional computing. The main idea is that only part of the algorithm (the most expensive part) runs on the quantum chip, while the rest runs on traditional processors. Furthermore, designing and controlling quantum computers can itself drive substantial progress in classical computing. So quantum computers will not compete with supercomputers but will be part of them. A supercomputer is nothing more than many computers connected together and working in parallel. Just as today’s supercomputers can contain different types of processor (CPU, GPU, TPU…), they can also contain quantum processors, which will be used only for applications where traditional computing may be insufficient due to the nature of the problem to be solved.
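The hybrid scheme described above can be sketched in a few lines. This is a toy illustration under stated assumptions, not any particular production API: the placeholder `quantum_subroutine` stands in for the expensive part that a real setup would evaluate on a quantum chip, while a classical optimizer loop decides the next parameters, as in variational quantum algorithms.

```python
import random

def quantum_subroutine(theta: float) -> float:
    """Placeholder for the quantum part: in a real hybrid algorithm this
    would prepare a parametrised circuit on the quantum chip, measure it,
    and return an expectation value. Here it is a simple stand-in cost."""
    return (theta - 1.0) ** 2

def classical_optimizer(steps: int = 100, lr: float = 0.1) -> float:
    """Classical part: propose parameters, query the quantum subroutine,
    and update via finite-difference gradient descent."""
    theta = random.uniform(-2.0, 2.0)
    eps = 1e-4
    for _ in range(steps):
        grad = (quantum_subroutine(theta + eps)
                - quantum_subroutine(theta - eps)) / (2 * eps)
        theta -= lr * grad  # classical update rule
    return theta

print(round(classical_optimizer(), 2))  # -> 1.0 (the cost's minimum)
```

All the orchestration, bookkeeping and optimisation here is ordinary classical computing; only the single call per iteration would touch quantum hardware, which is why quantum processors slot naturally into a supercomputer rather than replacing it.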
Deployment of quantum computers in Europe and Spain
At present, there are different prototypes of quantum computers distributed among universities, research centres and companies in Europe. Most of them operate in a development environment, that is, access is restricted to the scientists who build and study them. For the more advanced prototypes, initiatives are emerging to broaden this access to other users. Countries such as France, Finland, Germany, Italy, the Netherlands and Spain have started projects that seek to provide access to quantum computers. EuroHPC is also supporting the European Quantum Computing & Simulation Infrastructure (EuroQCS) by installing European-made quantum computers, using different technologies, in supercomputing centres.
Specifically, in Spain, we have the Quantum Spain project, deployed within the Spanish Supercomputing Network (RES) and coordinated by the Barcelona Supercomputing Center (BSC). The aim of this project is to install a quantum computer in the BSC, to which access will be provided free of charge through the RES, following the same protocols as for access to the network’s supercomputers. The company commissioned to build this quantum computer is the Spanish start-up Qilimanjaro, jointly with the technology corporation GMV. This joint venture will be in the hands of European technology providers, thereby meeting the objectives set by the European Union to give priority to technological sovereignty. As well as hosting this quantum computer, the RES will also develop quantum emulators that make it possible to simulate the behaviour of quantum computers up to a certain size in a controlled environment and to study quantum algorithms without having to use a real device. Quantum Spain also seeks to develop new algorithms and applications; consequently, the network cooperates with a large number of Spanish universities and research centres that are experts in this field. Lastly, a basic component of any technological development is training the new generation of experts in this technology. Another of this project’s goals is to promote all activities and initiatives that contribute to attracting talent to quantum computing.
Thanks to the Quantum Spain project and the country’s proven expertise in supercomputing, Europe has chosen Spain to be one of the first EuroQCS nodes. The BSC will have a second quantum computer funded by the European Union through the European High Performance Computing Joint Undertaking (EuroHPC JU). The two quantum computers will be integrated in the MareNostrum5 supercomputer, which is one of the most powerful in Europe. Thus, Spain will have a pioneering, heterogeneous computing infrastructure, with processors having different features. The two Spanish projects, Quantum Spain and EuroQCS-Spain, have been possible thanks to the funding provided by the Spanish Secretariat for Digitisation and Artificial Intelligence with funds from the Recovery, Transformation and Resilience Plan (PRTR).
Looking to the future
The European endeavour to attain technological sovereignty was abandoned for many years in favour of the supposed benefits of globalisation. Ongoing geopolitical tensions, aggravated by the global coronavirus pandemic, have exposed all the EU countries’ industrial shortcomings and have revived efforts to transform Europe into a supplier and not just a customer.
Although technological sovereignty must be a clear goal of European policy, we must not forget that science and knowledge know no borders, and research policies must be capable of achieving a balance between protecting intellectual property and fostering scientific and industrial collaboration. Countries such as the United States, Canada or China have more advanced technology than Europe. At the same time, Europe is a large discovery and knowledge generator and, consequently, in many cases, is a major talent exporter. Quantum computers are still immature and their true potential has yet to be discovered. In order to face such a huge technological challenge, we must all look for the best way to cooperate, without forgetting all the talent that may remain hidden in other countries traditionally less focused on this field of science.
All quantum algorithms need a classical computing component. Quantum computers will not compete against supercomputers, but will be part of them
Ultimately, quantum computing is just another example of the fruits that can be obtained when an obstinate bunch of scientists decide to ask themselves how the cogs of nature work and “what would happen if…?”. They do so without necessarily focusing on future applications or inventions (which can take years to arrive), simply for the pleasure and mission of expanding the frontiers of knowledge, and decide to work, in many cases, on the dopiest ideas they have ever heard. Thanks to them, today we reap the fruits of the science of the past, while at the same time planting the seeds of the science and technology of the future.
References and footnotes
1 — The original quotation is: “That is positively the dopiest idea I ever heard.” Hillis, D. (1989). “Richard Feynman and the Connection Machine”. Physics Today, 42(2): 78.
Bibliographical references
- Benioff, P. (1980). “The computer as a physical system: A microscopic quantum mechanical Hamiltonian model of computers as represented by Turing machines”. Journal of Statistical Physics, 22(5): 563-591.
- Binosi, D.; Calarco, T.; Colin de Verdière, G.; Corni, S.; Garcia-Saez, A.; Johansson, M. P.; Kannan, V.; Katz, N.; Kerenidis, I.; Latorre, J. I.; Lippert, Th.; Mengoni, R.; Michielsen, K.; Nominé, J. P.; Omar, Y.; Öster, P.; Ottaviani, D.; Schulz, M.; Tarruell, L. (2022). “EuroQCS: European Quantum Computing & Simulation Infrastructure”. Quantum Flagship. Available online.
- European Commission. “European Chips Act”. Available online.
- European High Performance Computing Joint Undertaking (EuroHPC JU). “Selection of six sites to host the first European quantum computers”. Press release, October 2022. Available online.
- Feynman, R. P. (1982). “Simulating physics with computers”. International Journal of Theoretical Physics, 21: 467–488.
- Manin, Y. (1980). “Computable and Uncomputable”. Sovetskoye Radio, Moscow, 128.
- Preskill, J. “Quantum computing 40 years later”. In: Feynman Lectures on Computation, 2nd ed., edited by Anthony J. G. Hey. Taylor & Francis Group. arXiv:2106.10522 [quant-ph].
Alba Cervera Lierta
Alba Cervera Lierta is a researcher at the Barcelona Supercomputing Center. She holds a PhD in Quantum Computing and Information from the University of Barcelona and an MSc in Particle Physics. After obtaining her doctoral degree, she went to the University of Toronto as a postdoctoral researcher in the group of Alán Aspuru-Guzik. Her fields of study revolve around quantum computing and its short-term applications, and the synergies between quantum physics and artificial intelligence. Since October 2021, she has coordinated the Quantum Spain project, an initiative to support the quantum computing ecosystem whose goal is to operate a quantum computer at the BSC-CNS.