
Challenges of Quantum Computing

Scytl · Published in EDGE Elections · Mar 21, 2024

Quantum computing is getting closer to becoming a staple technology. Large tech companies are investing heavily in quantum technologies, and research institutes are working intensively on new algorithms. But how far are we from having practical quantum computers? What challenges remain to be solved? As we briefly introduced in our previous article, there are still hurdles on the path to practical quantum computers, since they are far more vulnerable to errors than classical computers. In this article we dive into three of the main challenges: quantum decoherence, error correction, and scalability.

Compared with standard computers, quantum computers are extremely susceptible to noise. The quantum state of qubits is extremely fragile: any disturbance, such as a slight vibration or a change in temperature, can uncontrollably affect the computer, causing the stored information to be lost. This can be understood by picturing a flipped coin landing on its edge. Ideally, without wind or vibrations, the coin would stay upright. In the real world, however, any small disturbance will cause the coin to fall on one of its sides, just as a qubit loses its quantum state and consequently causes errors in computations. This phenomenon is called quantum decoherence.
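
To make this loss of "quantumness" concrete, here is a minimal numerical sketch in Python of pure dephasing: a qubit starts in an equal superposition, and the coherence term of its density matrix decays exponentially over time. The T2 value is an illustrative assumption, not a measured hardware spec.

```python
import math

# Toy model of dephasing: a qubit starts in the superposition
# |+> = (|0> + |1>) / sqrt(2). Its density matrix is
#   rho = [[0.5, 0.5],
#          [0.5, 0.5]]
# Under pure dephasing, the off-diagonal terms (the "coherence")
# decay as exp(-t / T2), while the populations stay at 0.5.

T2 = 100e-6  # assumed coherence time: 100 microseconds (illustrative)

def coherence(t: float) -> float:
    """Magnitude of the off-diagonal density-matrix element at time t."""
    return 0.5 * math.exp(-t / T2)

for t_us in (0, 50, 100, 200, 500):
    t = t_us * 1e-6
    print(f"t = {t_us:3d} us  ->  coherence = {coherence(t):.3f}")
# Once the coherence term is ~0, the superposition is gone and the
# qubit behaves like a classical random coin.
```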

Decoherence poses a challenge in quantum computing because, to avoid errors, quantum computations must be completed before decoherence occurs, i.e., within the coherence time of a qubit, which is relatively short. Although increasing this timeframe is a significant area of research, a popular approach to tackling quantum decoherence is the development of error correction codes.
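
As a rough back-of-the-envelope illustration of why the coherence time matters (both numbers below are assumptions chosen for the example, not vendor specifications), it directly caps how many gates can run before the state degrades:

```python
# Back-of-the-envelope: how many gates fit inside the coherence time?
# Both numbers are illustrative assumptions, not measured specs.
t2_coherence = 100e-6   # assumed coherence time: 100 microseconds
gate_duration = 50e-9   # assumed gate time: 50 nanoseconds

max_gates = int(t2_coherence / gate_duration)
print(f"Rough gate budget before decoherence: ~{max_gates} gates")
# ~2000 gates: an uncorrected computation must fit inside this window,
# which is far too small for algorithms of practical interest.
```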

The aim of quantum error correction techniques is to detect and correct errors caused by quantum decoherence before they can affect quantum computations. Although correcting errors using codes is not a new concept, it is more challenging in the quantum setting than in the classical one. While in classical computers errors are quite rare and occur as bit flips (a 0 accidentally changes to 1 or vice versa), in quantum computers the frequency of errors is much higher, and they can manifest as phase flips, bit flips, or a combination of both. In addition, due to the no-cloning theorem of quantum mechanics, qubits cannot simply be replicated, as is done with classical bits. For all these reasons, quantum computing requires more complex error correction codes.
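
The following sketch shows, with plain NumPy, how these two quantum error types act on a qubit state; a classical bit can only suffer the bit-flip (X) kind, which is one reason quantum codes must do more work than classical ones:

```python
import numpy as np

# A qubit state a|0> + b|1> represented as a 2-component vector.
state = np.array([1, 1]) / np.sqrt(2)   # the |+> state

X = np.array([[0, 1], [1, 0]])   # bit flip: swaps |0> and |1>
Z = np.array([[1, 0], [0, -1]])  # phase flip: flips the sign of |1>
ZX = Z @ X                       # bit flip and phase flip combined
                                 # (equal to Pauli Y up to a global phase)

print("bit flip    X|+> =", X @ state)   # |+> is unchanged by X
print("phase flip  Z|+> =", Z @ state)   # |+> becomes |->
print("both       ZX|+> =", ZX @ state)
# A quantum code must detect X errors, Z errors, and their
# combinations, whereas a classical code only ever sees bit flips.
```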

In Quantum Error Correction (QEC), the quantum information stored in one logical qubit is protected from errors by encoding it into several physical qubits. Logical qubits are the units algorithms are written in, while physical qubits are the actual hardware elements that process information (ions, photons, superconducting circuits). When we say that IBM built a 433-qubit quantum computer or that Google announced a 72-qubit quantum processor, we are talking about physical qubits. The ratio of physical to logical qubits varies depending on the hardware used, the algorithm run, or the specific application, but overall, the higher the error rate, the more physical qubits you need.

There are many codes widely used for quantum error correction, each representing a different way of encoding quantum information over physical qubits. Some examples are the Shor code (1 logical qubit encoded into 9 physical qubits), the Steane code (1 logical qubit encoded into 7 physical qubits), and the surface code (a family of quantum error-correcting codes defined on a two-dimensional lattice of qubits). Both academia and industry are researching this topic intensively, although, as mentioned in a Quantum Insider post, "despite very impressive results, it is clear that we are still a long way from practical and usable QEC".
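
To illustrate the encoding idea, here is a toy, purely classical simulation of the 3-qubit bit-flip repetition code, the building block from which the 9-qubit Shor code is constructed. A real quantum code protects superpositions via syndrome measurements, which this sketch deliberately skips; it only shows the redundancy-plus-majority-vote principle and the error rate chosen is an arbitrary assumption.

```python
import random

def encode(logical: int) -> list[int]:
    """One logical bit -> three physical bits (repetition code)."""
    return [logical] * 3

def apply_noise(physical: list[int], p_flip: float) -> list[int]:
    """Independently flip each physical bit with probability p_flip."""
    return [b ^ 1 if random.random() < p_flip else b for b in physical]

def decode(physical: list[int]) -> int:
    """Majority vote: recovers the logical bit if at most one bit flipped."""
    return int(sum(physical) >= 2)

random.seed(0)
p = 0.05           # assumed physical error rate (illustrative)
trials = 100_000
failures = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
print(f"physical error rate: {p}")
print(f"logical error rate : {failures / trials:.4f}")  # roughly 3*p**2
# The code fails only when 2 or more bits flip, so the logical error
# rate is quadratically suppressed relative to the physical rate.
```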

Scalability refers to the ability to increase the number of qubits in a quantum system so that more complex problems can be solved. As previously mentioned, qubits are notoriously fragile, and any small interference can cause errors in computations. This, together with the difficulty of connecting large numbers of qubits into larger systems, makes scaling a challenging process. One possible solution is the use of error correction techniques, which build redundancy into the system to allow errors to be detected and corrected. However, as previously mentioned, these techniques require additional qubits and complex algorithms, which further complicate scaling.

Currently, the largest quantum computer has 1,180 qubits, as explained in a New Scientist article, while the number of physical qubits needed to factor a 2048-bit integer in 8 hours is estimated at 20 million. The authors of the latter estimate assume that the qubits are built from superconducting circuits and that error correction is done with the surface code.
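
To see where such multi-million figures come from, here is a back-of-the-envelope sketch using commonly quoted surface-code scaling rules. The threshold, prefactor, target error rate, and logical-qubit count below are all illustrative assumptions, not figures taken from the cited paper:

```python
# Rough surface-code overhead estimate under textbook-style assumptions:
#   - logical error rate ~ 0.1 * (p / p_th) ** ((d + 1) / 2)
#   - ~2 * d**2 physical qubits per logical qubit (data + ancilla)
# All constants below are illustrative, not hardware or paper figures.
p = 1e-3          # assumed physical error rate
p_th = 1e-2       # assumed surface-code threshold
target = 1e-15    # assumed target logical error rate for a long run

# Find the smallest odd code distance d that meets the target.
d = 3
while 0.1 * (p / p_th) ** ((d + 1) / 2) > target:
    d += 2

physical_per_logical = 2 * d ** 2
logical_qubits = 6_000  # assumed order of magnitude for 2048-bit factoring

print(f"code distance d            : {d}")
print(f"physical qubits per logical: {physical_per_logical}")
print(f"total physical qubits      : {physical_per_logical * logical_qubits:,}")
# This kind of multiplication (thousands of logical qubits, each costing
# ~1,000+ physical qubits) is why estimates land in the millions, in the
# same ballpark as the 20-million figure above.
```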

So, how long until we have million-qubit machines? IBM predicts having a one-million-qubit quantum computer by 2030, and Google by 2029, both working on superconducting qubits. Meanwhile, the start-up PsiQuantum is planning to build a one-million-qubit photonic quantum computer with error correction (about 1,000 error-corrected qubits) by 2025.

The Global Risk Institute publishes a yearly report that "aims at providing an educated perspective of how far away the quantum threat is, by collecting and examining the perspectives of global experts from academia and industry, involved in diverse facets of quantum computing." In the 2023 report, 37 leading experts on quantum computing (Jay Gambetta from IBM, Dave Bacon from Google Quantum AI, Peter Shor from MIT, etc.) were interviewed. A significant number of them agree that, for the goal of implementing a quantum computer with roughly 100 logical qubits in the next 15 years, the leading candidates are superconducting systems and trapped ions. To the question "Please indicate how likely you estimate it is that a quantum computer able to factorize a 2048-bit number in less than 24 hours will be built within the next 5 years, 10 years, 15 years, 20 years, and 30 years", just over half of the experts answered that within 20 years the likelihood is greater than 70%, and within 30 years, greater than 95%.

Although the challenges inherent to quantum computing are significant, they are not insurmountable. Relying on expert opinion, it is not unreasonable to expect that within 10 years the cryptographic systems used in communications could be at risk, so we must start preparing for the quantum future now.

This article was written by Núria Costa (PhD), Cryptography Researcher at Scytl.
