This article follows our event on February 4th, where we invited physicist Marius Paraschiv, PhD; it is a partial summary of his presentation and of the discussion that followed. Find out more about our upcoming events and join us for free by following our Facebook page or visiting our website.
Quantum theory, applied first to physics and more recently to an increasing number of other disciplines, is a great example of how ideas flow in science. Sir Isaac Newton's saying about seeing further than others mostly because he stood on the shoulders of giants fits this context perfectly. The transition from a classical, deterministic theory to a quantum, probabilistic one is precisely the kind of idea that begs the deep question: how could one ever think of such a way of understanding nature?
Quantization in Physics
The connection between experiment and theory, especially in the parts of physics that are invisible to the naked eye and where only the effects can be observed, often seems fractured. While some physicists are driven by puzzling manifestations seen in experiments, others seek coherence and want to fill in the gaps of existing theories.
Quantum theory is traced back to the German physicist Max Planck, in the year 1900 precisely. Following experimental observations for which classical physics had no explanation, concerning the emission spectra of atoms in rather common metals, such as the filaments of light bulbs, Planck makes the daring proposition of quantization. He suggests banishing continuity at the subatomic level and speaks of atoms with a rationed, hence fragmented, behaviour: energy is exchanged only in discrete packets.
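To put a number on this fragmentation, here is a small illustrative Python snippet, our own addition rather than part of the presentation: it computes the energy of a single quantum of green light from Planck's relation (energy equals Planck's constant times frequency); the wavelength is just an example value.

```python
# Illustrative only: the energy carried by a single quantum ("packet") of light,
# using Planck's relation E = h * f, where f = c / wavelength.
h = 6.626e-34   # Planck constant, in joule-seconds
c = 3.0e8       # speed of light, in metres per second

wavelength = 530e-9            # green light, roughly 530 nanometres (example value)
frequency = c / wavelength     # frequency of the light wave
energy = h * frequency         # energy of ONE quantum, in joules

print(f"One quantum of green light carries about {energy:.2e} J")
# Energy is exchanged only in whole multiples of this tiny amount: 1E, 2E, 3E, ...
```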
The quantum theory of matter was followed by another equally bizarre, experimentally motivated observation: wave-particle duality. The particles that make up light, photons, can behave both as corpuscles, like the tiniest peppercorns, and as waves, like the ripples created by throwing a rock into a lake. This dual character is not only inseparable, but also simultaneous. The theory was formulated following Albert Einstein's 1905 explanation of the photoelectric effect and was daringly generalized, almost universally, by Louis-Victor de Broglie in 1924. He proposed that not only photons exhibit such dual behaviour: all matter has a wavelength, hence a wavelike nature. De Broglie even wrote down a formula to compute the associated wavelength of any object, not only at the subatomic level.
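As an illustration of how universal de Broglie's formula is, the following Python sketch, our own example with illustrative values for mass and speed, applies wavelength = h / (mass × velocity) to an electron and to a tennis ball.

```python
# Illustrative only: de Broglie's formula, wavelength = h / (m * v),
# applied to an electron and to an everyday object (values are examples).
h = 6.626e-34            # Planck constant, in joule-seconds

m_electron = 9.109e-31   # electron mass, in kilograms
v_electron = 1.0e6       # an electron moving at a million metres per second
print("electron:", h / (m_electron * v_electron), "m")   # ~7e-10 m, atomic scale

m_ball = 0.058           # a tennis ball, in kilograms
v_ball = 50.0            # a serve at about 50 metres per second
print("tennis ball:", h / (m_ball * v_ball), "m")        # ~2e-34 m, immeasurably small
```

The electron's wavelength is comparable to the size of an atom, which is why wave effects matter there; the tennis ball's is unimaginably smaller than anything measurable, which is why we never notice its wavelike nature.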
Quantization of Computers… for Simulating Physics
It should not come as a surprise that the first steps towards quantum computers were made by physicists. One of America's most revered scientists, Richard P. Feynman, published an article in 1981 titled Simulating Physics with Computers. Therein, Feynman argues that since the quantum manifestations of nature cannot be ignored (he himself had been awarded the Nobel Prize in 1965 for discoveries in this field), it follows that any computer trying to faithfully simulate physics must equally be based on quantum principles.
Forty years later, the construction of quantum computers has begun, but they remain largely at a proof-of-concept stage, both in their theoretical foundations and in their engineering.
The Main Difficulties
If we acknowledge that quantum theory in physics refers primarily to subatomic particles, we naturally assume that a computer designed to exploit it is hard and (very) expensive to build. News about the experiments taking place at CERN mentions accelerators and extremely sophisticated devices that are hard to build and control, far from the familiar computers we have at home, at the office, in our pockets or on our wrists. Among other inconveniences are their great sensitivity to vibration and impurities, hence the strict requirement of controlled enclosures; also, some components must be kept cooled below −270 °C, very close to absolute zero, 0 kelvin.
Although quantum computers are true engineering marvels, whose construction we will not even attempt to discuss, the essential point is that the greatest difficulty comes from the theoretical complexity of quantum behaviour itself. While technology can and does advance beyond all imagination, much quicker than most predictions, the subatomic laws are, probably, definitive.
The phrasing at the end of the previous paragraph is both punny and fitting. Subatomic complexity is governed by probability theory, also described as stochastic, from the Greek word stókhos (στόχος), meaning target, aim, goal. The Austrian physicist Erwin Schrödinger, a Nobel laureate in 1933, rigorously introduced[1] this theory into the study of subatomic particle behaviour. Alongside the wave-particle duality we mentioned, one of the best-known and most-discussed postulates of quantum mechanics is at work: by observing a system, we force it to change its state, hence we are denied any direct access to subatomic phenomena. Probability theory then adds that the state of a quantum system can only ever be determined with a computed likelihood. Certainty, and with it classical determinism, therefore becomes forbidden.

Such a phenomenon manifests itself even in the mathematical and logical foundation a computer operates on: the bit. A classical computer, like any digital device, stores, processes and transmits information, at its lowest level, in a binary encoding, that is, using only 0 and 1, implemented electronically through specific voltages. Quantum computers, on the other hand, use qubits (a playful contraction of quantum bits). A qubit is never certainly in state 0 or state 1, but only with some probability: for example, 75% in state 0 and 25% in state 1.
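To make the 75%/25% example concrete, here is a toy simulation in plain Python, our own sketch involving no quantum hardware or SDK: a qubit's state is described by two amplitudes, the squared magnitudes of which give the probabilities of reading 0 or 1, and only by repeating the measurement many times do those probabilities become visible.

```python
# Toy illustration of the 75% / 25% qubit mentioned in the text.
import math
import random

amp0 = math.sqrt(0.75)   # amplitude for state 0
amp1 = math.sqrt(0.25)   # amplitude for state 1

p0 = amp0 ** 2           # probability of reading 0  -> 0.75
p1 = amp1 ** 2           # probability of reading 1  -> 0.25

# Each measurement forces the qubit into 0 or 1; only by repeating the
# experiment many times do the underlying probabilities become apparent.
counts = {0: 0, 1: 0}
for _ in range(10_000):
    outcome = 0 if random.random() < p0 else 1
    counts[outcome] += 1

print(counts)   # roughly {0: 7500, 1: 2500}
```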
Equally important is the software. What could a computer do without programming languages to instruct it? Languages that target quantum systems are actively being developed, but their progress is not particularly fast, partly due to the scarcity of hardware to test them on and partly due to the intricacies of the quantum phenomena themselves. The Q# language, developed by Microsoft, is one example, as is Qiskit, a software development kit (SDK) implemented in Python and developed by IBM.
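As a small taste of what such software looks like, here is a minimal sketch using Qiskit's QuantumCircuit class; it only builds and prints a one-qubit circuit, a Hadamard gate followed by a measurement, without running it on any hardware or simulator (which would require extra tooling we leave aside here).

```python
# A minimal Qiskit sketch: build (but do not execute) a one-qubit circuit
# that puts the qubit into an equal superposition and then measures it.
from qiskit import QuantumCircuit

circuit = QuantumCircuit(1, 1)   # one qubit, one classical bit for the result
circuit.h(0)                     # Hadamard gate: 50% / 50% superposition of 0 and 1
circuit.measure(0, 0)            # measuring collapses the qubit to 0 or 1

print(circuit.draw())            # text drawing of the circuit
```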
Is the Computational Future a Quantum One?
To be brief, but also probabilistic, as befits any quantum answer: not entirely. In 2019, IBM introduced the first publicly available unit, Q System One, priced at around $10,000,000 and using 20 qubits. In the fall of 2022, IBM launched Osprey, the most advanced quantum computer to date, with 433 qubits.
Given that such progress took only three years, why then is our answer to the opening question negative? From an engineering standpoint, one can expect technology to progress beyond any border of imagination, and it so often does. The limitations that remain are the theoretical ones, condensed in the expression quantum supremacy. Put simply, quantum supremacy is attained when one proves, either theoretically or through an experiment, that an algorithm implemented on a quantum computer can solve a problem which is impossible for classical computers. Said impossibility is itself theoretical and has nothing to do with computing power. The way classical computers work, based on mathematical concepts such as automata and Turing machines, makes them inherently powerless when confronted with certain specific, but far from negligible, problems, regardless of their hardware. The best-known example is the halting problem. But that is a topic for another time.
References and Recommended Materials
Since the subject is extremely sophisticated even for people familiar with modern and quantum physics, we decided to include mostly historical and popular references. If, however, one is interested in (very) technical readings, we recommend:
M. Nielsen, I. Chuang – Quantum Computation and Quantum Information, Cambridge University Press, 2000;
W. Scherer – Mathematics of Quantum Computing, Springer, 2019.
More popular articles, relevant mostly for the historical development of the subject, are:
Richard P. Feynman – Simulating Physics with Computers, 1981, an article which can be read here;
A series of articles by the Quanta Magazine, which treat various problems referring to quantum computers, algorithms, programming languages and theoretical problems;
The section devoted to quantum computers on IBM’s website, as well as their roadmap in this field;
A Medium article, which includes further reading materials.
If you prefer video content, we recommend:
A presentation of quantum computers, starting from the basic physics concepts, from Quanta;
A clip by Veritasium;
An expert presents the concept in 5 levels of complexity, in the series titled 5 Levels, by Wired magazine.
[1] The origin of many of the ideas, theories and experiments that made up the quantum revolution in the first decades of the twentieth century is hard to attribute to individuals. The flow of scientific ideas, as well as the diversity of the research teams, on both the experimental and the theoretical (mathematical) sides, meant that many physicists modified, in more or less significant ways, the results of their collaborators or contemporaries, for instance by treating separately a particular case that a given theory did not cover.