On the simulation of our reality

In the 'Students on Science' series we present English-language articles written by students of the Science Communication course at the UvA. Today we publish the first instalment of a new series, belonging to a new academic year. Vic Vander Linden describes how we can simulate reality using quantum bits.

The real world, or a simulation? Image: Keith Edkins.

The year is 2055. After living a large part of your life peacefully and well, you're finally pulled out of the simulation and into real life. What would you see? Scientists think they can already answer this question. The simulation itself would not be built out of zeros and ones like in The Matrix, but out of quantum bits! This new informational concept has many weird but fascinating properties. It seems to violate fundamental physical laws, which led Einstein to reject the underlying ideas as early as the beginning of the 20th century, reasoning: "God does not play dice". Years later, however, these quantum bits seem to be the true informational building blocks of reality.

Quantum bits

When we zoom in on the smallest parts of our universe, such as electrons and quarks, there seems to exist an entirely new set of physical laws. These laws make up quantum theory. While scientists have yet to reconcile quantum theory with the theory of very big things – general relativity – this quantum playbook has made a plethora of correct predictions about the nature of the universe. Consequently, a lot of attention has gone to understanding the concept of information in this world, so-called quantum information theory. Ordinary information theory uses bits, strings of zeros and ones, to convey complex information such as the daily Wordle. In modern computers, for example, a bit is usually a small electrical signal or the absence thereof. While a bit is fundamentally either a zero or a one, a quantum bit can be 'in between the two', or perhaps better phrased: a bit of both. This is the way quantum information theory upgrades a bit to a quantum bit. The concept of a quantum bit is, like a bit, a mathematical concept used to quantify the fundamental pieces of information. However, it differs from a regular bit in many ways. A bit can have a 50% probability of being a zero and a 50% probability of being a one, but it is always secretly one or the other; a quantum bit, by contrast, is allowed to be a zero and a one at the same time. This concept is called superposition, and it has stumped scientists for many years.

In an attempt to explain it properly, allow me to indulge in the following analogy. Let's say there are two different dresses, a blue one and a golden one. If I flip a coin and wear a certain colored dress based on the outcome, I will have a 50% chance of wearing a blue dress and a 50% chance of wearing a golden one – nothing new here. But now, quantum mechanically, I have only one dress, which is in a superposition of blue and gold at the same time. Similarly to the picture of a dress that went viral in 2015, it is not until you look at the dress that an actual color manifests itself. By looking, measuring, you break the superposition. And analogously to what happens for a true quantum superposition, once observed, the dress will always remain the same color.

The Dress. An illustration of the dress that went viral in 2015. While good for the analogy, the true nature of the dress duality has to do with the human perception of color.
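To make the dress analogy slightly more concrete, here is a minimal sketch in Python (using NumPy; the 'blue' and 'gold' labels are purely illustrative) of a qubit in an equal superposition, and of what repeated measurements of identically prepared copies would look like.

```python
import numpy as np

# A qubit state is a 2-component complex vector (amplitudes for |0> and |1>).
# Here |0> plays the role of "blue" and |1> of "gold" -- labels are illustrative.
blue = np.array([1, 0], dtype=complex)
gold = np.array([0, 1], dtype=complex)

# An equal superposition: the dress is "a bit of both" until measured.
dress = (blue + gold) / np.sqrt(2)

# Measurement probabilities follow the Born rule: |amplitude|^2.
p_blue = abs(dress[0]) ** 2
p_gold = abs(dress[1]) ** 2
print(p_blue, p_gold)  # 0.5 0.5

# Measuring many identically prepared dresses gives a random mix of outcomes,
# but each individual dress, once observed, stays the color that was seen.
rng = np.random.default_rng()
outcomes = rng.choice(["blue", "gold"], size=10, p=[p_blue, p_gold])
print(outcomes)
```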

Proving quantum?

The breaking of superposition that happens after a measurement was experimentally confirmed by Alain Aspect, John Clauser and Anton Zeilinger in a series of experiments starting in the 1970s, with their work on entangled light particles. Using lasers, they polarized light particles in a certain way to create superpositions between them. The experiments then carried out the famous Bell test, a statistical test that can reveal this superposition behavior on multiple quantum bits together. For this work, they won the 2022 Nobel Prize in Physics. However, while this test is very convincing, it does not provide definitive proof of quantum theory as the underlying structure. It merely shows that any underlying theory must behave in this quantum way, thereby excluding the possibility of ordinary bits as the fundamental pieces of information.
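To get a feel for what a Bell test actually measures, the sketch below evaluates the so-called CHSH quantity for a maximally entangled pair of light particles, using the textbook singlet-state correlation E(a, b) = -cos(a - b) and a standard choice of polarizer angles; these are illustrative assumptions, not details of the actual prize-winning experiments. Any theory built on ordinary bits (local hidden variables) must keep this quantity at or below 2, while the quantum prediction reaches about 2.83.

```python
import numpy as np

# CHSH test: classical (local hidden variable) theories obey |S| <= 2,
# while quantum mechanics predicts up to 2*sqrt(2) for entangled pairs.
def E(a, b):
    # Correlation predicted by quantum mechanics for a singlet state,
    # with polarizers at angles a and b (in radians).
    return -np.cos(a - b)

# Standard measurement angles that maximize the quantum violation.
a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
print(abs(S))      # ~2.828, i.e. 2*sqrt(2)
print(abs(S) > 2)  # True: the classical bound is violated
```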

Quantum Computing

Quantum computing is a broad term for computations performed using quantum behavior. It can be broken down into two categories: simulating quantum behavior with quantum machines on the one hand, and tackling classical, non-quantum problems with the same quantum machines on the other. The latter seems nonsensical, but one of the key advantages of quantum bits over their regular counterparts is their processing capability. Qubits in superposition are used as intermediate computational tools in quantum algorithms. These algorithms exploit the fact that superposed qubits provide a much larger sandbox of unique states. This allows for computations that are classically virtually impossible, which Google was the first to demonstrate in 2019.
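As a rough illustration of that larger sandbox, the sketch below (Python/NumPy, not how a real quantum processor is actually programmed) shows that describing n qubits already requires 2^n complex amplitudes, and that a single layer of Hadamard operations places the register in a superposition over all 2^n bit strings at once.

```python
import numpy as np

n = 10  # number of qubits

# A classical n-bit register holds exactly one of 2**n values.
# An n-qubit register is described by 2**n complex amplitudes at once.
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0  # start in |000...0>

# A Hadamard gate on every qubit spreads the amplitude uniformly
# over all 2**n basis states -- an equal superposition of every bit string.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
full_op = H
for _ in range(n - 1):
    full_op = np.kron(full_op, H)
state = full_op @ state

print(len(state))                                # 1024 amplitudes for just 10 qubits
print(np.allclose(abs(state) ** 2, 1 / 2 ** n))  # True: uniform superposition
```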

However, the problems where such major steps forward can be made using quantum computers are often cherry-picked examples where quantum systems have a natural edge. The problem for which Google claimed quantum supremacy, for example, was simulating the output of a process constructed out of random quantum operations. While the randomness of the operations ensured that the results of the experiment were generalizable and not biased toward one particular process, some scientists argued that the problem itself had too much of a quantum nature to be considered a milestone achievement. Currently, problems simulated on a quantum computer often possess an inherently probabilistic nature, and their solutions are rarely applicable to real-world situations.

The other use case for quantum computers is to simulate systems that are inherently quantum themselves. With experiments such as the Bell test suggesting that our reality is inherently quantum mechanical, this would be the way to simulate it accurately. So if the machines from The Matrix were truly scientifically advanced, they would certainly have used a quantum simulator. However, it must be noted that the maximum number of quantum bits in a quantum computer currently remains around a thousand. Furthermore, the quantum bits themselves are very prone to errors, which has spurred the field of quantum error correction. Google's new quantum chip, Willow, is a breakthrough on this front: with it, Google appears to have found a way to reduce errors as the number of qubits increases. However, the problem the chip addressed remained of the same probabilistic nature as the one from 2019.

A quantum computer: the IQM Quantum Computer installed in Espoo, Finland. Image via Wikimedia Commons.

While the mathematical definition of qubits is universally accepted, how best to create them remains an open question. There is a true arms race between companies, each pioneering its own method for achieving the highest number of reliable qubits. Companies may use photonic qubits, trapped-ion qubits or superconducting charge qubits, each of which exploits the quantum behavior of different particles. The superconducting charge qubit, used by the likes of IBM and Google, is created using a small electrical signal in a superconducting loop. Microsoft recently claimed a new method, Majorana qubits. Physically, these are quasi-particles that only exist in a superconducting nanowire. The main difference with other superconducting chips is that the qubit is spread over the whole wire, yielding topologically protected qubits that reduce the overall error rate.

The real-world examples of where quantum computers may be used are vast and range from applications in materials science, cryptography and artificial intelligence to optimizing complex flight schedules and financial modeling. One of the most likely future applications is the breaking of previously unbreakable RSA encryption. It would be cracked in a heartbeat by Shor's algorithm, with consequences ranging from exposing your current WhatsApp messages to more serious impacts on critical infrastructure.
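To see why RSA is vulnerable, here is a purely classical toy version of the number-theoretic step behind Shor's algorithm; the quantum computer's actual contribution would be finding the period r exponentially faster than any known classical method, and the values N = 15 and a = 7 are illustrative choices.

```python
from math import gcd

N = 15   # toy "RSA modulus" to factor
a = 7    # a base coprime to N

# Classically find the period r of a^x mod N (the step a quantum
# computer would perform efficiently in Shor's algorithm).
r = 1
while pow(a, r, N) != 1:
    r += 1
print(r)  # 4

# For even r (with a^(r/2) not equal to -1 mod N),
# the factors of N follow from two greatest common divisors.
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print(p, q)  # 3 5
```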

Quantum entanglement

How are these quantum computers capable of such intensive calculations? As mentioned, this comes from the fact that multiple interacting quantum bits increase the number of possible intermediate computational states. The interactions lead to a property called quantum entanglement. Using the dress analogy once more, entanglement is like having a pair of quantum shoes that have been linked, or entangled, with the dress, but are currently in a different room. In this case, not only is the color of the shoes uncertain until observed, but looking at the color of the shoes in one room can break the superposition of the dress in the other room, causing it to assume one color as well. Einstein called this "spooky action at a distance" because, regardless of the distance between the two entangled quantum bits, observing one at a certain place and time will instantaneously alter the state of the other quantum bit. While this seems to allow information to travel faster than the speed of light – the speed limit imposed by Einstein's theory of relativity – it actually does not. The exchange of information is merely an illusion. This is because the person making the measurement has no control over the way the entanglement breaks, or in other words, over which color the dress ultimately becomes. As a result, reading the color of the dress will indeed change the state of the shoes, but it will do so in an uncontrolled manner – it cannot be used to transmit information.
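The sketch below (Python/NumPy, with the dress-and-shoes labels again purely illustrative) simulates many measurements on such an entangled pair: the two outcomes always agree, yet each individual outcome is completely random, which is exactly why the correlation cannot be used to send a message.

```python
import numpy as np

rng = np.random.default_rng()

# The entangled "dress + shoes" state: an equal superposition of
# (blue dress, blue shoes) and (gold dress, gold shoes).
# Amplitudes for the four joint outcomes [bb, bg, gb, gg]:
state = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
probs = abs(state) ** 2

outcomes = rng.choice(["blue/blue", "blue/gold", "gold/blue", "gold/gold"],
                      size=1000, p=probs)

# The outcome on each side is 50/50 random...
print(np.mean([o.startswith("blue") for o in outcomes]))  # ~0.5
# ...but the two sides always agree, whatever the distance between them.
print(all(o.split("/")[0] == o.split("/")[1] for o in outcomes))  # True
```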

Outlook

Will there ever be a moment when humankind is able to simulate the whole universe? Categorically, no: it would require all the resources of the universe itself. However, there might come a time when both the number of qubits and the absence of noise on those qubits improve significantly. Perhaps one day, true alternative lifeforms, built out of the same quantum building blocks as we are, may originate in these simulations. Such feats seem many years away. Nonetheless, as the qubit count continues to increase, the field of quantum information is a fascinating example of pioneering twenty-first-century research.