Is Quantum Doomsday for Current Data Encryption Near?

Bernard Lin
Published in ReBloc
Jun 3, 2019

There is a lot of talk about the imminent threat that quantum computers pose to current encryption schemes such as RSA and elliptic-curve cryptography, but just how "imminent" is the threat?

Richard Feynman discussed quantum computers in his 1982 paper, "Simulating Physics with Computers." Quantum computers exploit two quantum-mechanical phenomena: entanglement and superposition.

In quantum computing, the classical two-state circuit element (the transistor) is replaced by a quantum element called a quantum bit, or qubit. Like the conventional bit, it also has two basic states. Although a variety of physical objects could reasonably serve as quantum bits, the simplest thing to use is the electron's internal angular momentum, or spin, which has the peculiar quantum property of having only two possible projections on any coordinate axis: +1/2 or –1/2 (in units of the Planck constant). Whatever the chosen axis, you can denote the two basic quantum states of the electron's spin as ↑ and ↓.

In a system with two qubits, there are 2^2, or 4, basic states, which can be written (↑↑), (↑↓), (↓↑), and (↓↓). Naturally enough, the two qubits can be described by a quantum-mechanical wave function that involves four complex numbers. In the general case of N qubits, the state of the system is described by 2^N complex numbers, which are restricted by the condition that their squared magnitudes must all add up to 1.

While a conventional computer with N bits at any given moment must be in one of its 2^N possible states, the state of a quantum computer with N qubits is described by the values of the 2^N quantum amplitudes, which are continuous parameters (ones that can take on any value, not just a 0 or a 1). This is the origin of the supposed power of the quantum computer, but it is also the reason for its great fragility and vulnerability.
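To make this concrete, here is a minimal sketch in Python with NumPy (the qubit count N = 3 and the random amplitudes are purely illustrative, not from any real device): an N-qubit state is just a vector of 2^N complex numbers whose squared magnitudes sum to 1.

```python
import numpy as np

# Illustrative only: represent an N-qubit state as 2^N complex amplitudes.
N = 3
rng = np.random.default_rng(0)

# Draw 2^N random complex amplitudes...
state = rng.normal(size=2**N) + 1j * rng.normal(size=2**N)

# ...and normalize them so their squared magnitudes sum to 1.
state /= np.linalg.norm(state)

print(len(state))                 # 8 amplitudes for N = 3
print(np.sum(np.abs(state)**2))   # ~1.0, as required
```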

How is information processed in such a machine? That’s done by applying certain kinds of transformations — dubbed “quantum gates” — that change these parameters in a precise and controlled manner.
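As a rough sketch of what a quantum gate does mathematically (the Hadamard gate used here is a standard textbook example, not one drawn from this article): a gate is a unitary matrix that transforms the amplitude vector, changing the continuous parameters in a precise way without ever breaking the normalization condition.

```python
import numpy as np

# The Hadamard gate: a 2x2 unitary matrix acting on one qubit's amplitudes.
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)

# Start in the basis state |0> (all amplitude on the first basis state).
ket0 = np.array([1, 0], dtype=complex)

# Applying the gate is just matrix multiplication; the result is the
# equal superposition (|0> + |1>) / sqrt(2).
superposed = H @ ket0
print(superposed)                              # [0.707..., 0.707...]

# Unitarity guarantees the squared magnitudes still sum to 1.
print(np.allclose(H.conj().T @ H, np.eye(2)))  # True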

In theory, quantum computers could speed up factoring dramatically: the algorithm invented by Peter Shor solves factoring in polynomial time. However, quantum computers are not suited to every type of problem. As Peter Shor has explained, they help only with problems that have the right mathematical structure, and classical computing is still required to interpret the measurements a quantum computer produces. To solve a problem, one needs to translate inputs and outputs through devices such as interferometers to get information into and out of the quantum computer, and then use classical computers to interpret the results.
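To illustrate how that work divides up, here is a hedged Python sketch of the classical half of Shor's algorithm (the function names and the textbook inputs n = 15, a = 7 are my own illustrative choices): the only step a quantum computer would accelerate is finding the period r, which this toy version simply brute-forces.

```python
from math import gcd

def find_period(a, n):
    # Brute-force the period r of a^x mod n (a must be coprime to n).
    # This is the step Shor's algorithm performs in polynomial time on
    # a quantum computer; classically it takes exponential time.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def classical_postprocessing(n, a):
    # Classical reduction: given the period r, two gcd computations
    # recover factors of n (when r is even and a^(r/2) != -1 mod n).
    r = find_period(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        return None  # unlucky choice of a; the algorithm retries
    return gcd(pow(a, r // 2, n) - 1, n), gcd(pow(a, r // 2, n) + 1, n)

print(classical_postprocessing(15, 7))  # (3, 5): the factors of 15
```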

There are many big challenges in making quantum computing practical and commercially feasible, such as:

  • Extremely Sensitive to Surrounding Noise: Any interaction (or measurement) leads to a collapse of the state function, which is called decoherence. It is extremely difficult to isolate a quantum system, especially one engineered for a computation, without it getting entangled with the environment. The larger the number of qubits, the harder it is to maintain coherence.
  • Algorithm Design: Measuring the state of a quantum computer collapses its large quantum state to a single classical result, which means that one can extract only the same amount of data from a quantum computer as from a classical computer of the same size (see the measurement sketch after this list). For quantum computers to break elliptic-curve encryption significantly faster than classical computers, quantum algorithms must exploit uniquely quantum features such as interference and entanglement, as Peter Shor's factoring algorithm does. Only a few dozen such algorithms have been developed in more than 25 years of research on quantum computing.
  • Error Correction: Classical gates, which operate on bits and are used to build computers, have very large noise margins; they can reject large variations in their inputs and still produce clean, noise-free outputs. Because a qubit in a quantum computer can be any combination of 1 and 0, qubits and quantum gates cannot readily reject small errors that occur in physical circuits. Small errors in creating the desired quantum operations, or any stray signals that couple into the physical system, can lead to wrong outputs in the computation. Building error-corrected quantum computers therefore requires a very large number of qubits. The most recent algorithmic breakthrough, which shows how a quantum computer could break 2048-bit RSA in 8 hours, requires 20 million qubits; the most advanced quantum computer today has about 70 qubits. The optimistic estimate for building quantum computers at such a scale is at least 20 years, if it is possible at all.
  • New Software Stack: Tools are needed to create and debug quantum computer software. Because quantum programs differ fundamentally from programs for classical computers, research and development are needed to build out the software tool stack.
  • Loading Large Data Inputs into a Quantum Computer Efficiently: A quantum computer can use a small number of qubits to represent an exponentially larger amount of data, but currently there is no way to convert a large amount of classical data into a quantum state rapidly, unless the data can be generated algorithmically. For problems that require large inputs, the time needed to create the input quantum state would dominate the computation time and greatly reduce the quantum advantage.
  • The Intermediate State of a Quantum Computer Cannot Be Measured Directly: Current debugging methods for classical computers rely on memory and the reading of intermediate machine states. Neither is possible in a quantum computer: a quantum state cannot simply be copied for later examination, per the so-called no-cloning theorem, and any measurement of a quantum state collapses it to a set of classical bits, bringing the computation to a halt. New debugging methods are required for large-scale quantum computers.
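Here is the measurement sketch referred to above (the helper name and the toy three-qubit state are illustrative): reading out an N-qubit register samples a single N-bit string with probability given by each amplitude's squared magnitude, so the 2^N continuous parameters are never directly visible.

```python
import numpy as np

def measure(state, rng):
    # Collapse: sample one basis state with probability |amplitude|^2
    # (the Born rule), returning only N classical bits.
    probs = np.abs(state) ** 2
    outcome = rng.choice(len(state), p=probs)
    n_bits = int(np.log2(len(state)))
    return format(int(outcome), f"0{n_bits}b")

rng = np.random.default_rng(42)

# An equal superposition over all 8 basis states of 3 qubits.
state = np.full(8, 1 / np.sqrt(8), dtype=complex)

# Each measurement yields just one 3-bit string; the amplitudes are gone.
print([measure(state, rng) for _ in range(5)])
```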

In conclusion, there is still a huge gap to bridge before quantum computers can break existing encryption schemes such as RSA. Some physicists, such as Mikhail Dyakonov and Gil Kalai, doubt whether quantum computers of such scale will ever be possible in the foreseeable future. Many exaggerated claims deserve to be debunked, such as this one: "If… you're optimizing the lengths of aircraft routes or optimizing the layout of spare parts for a rail network, something where there are 2^n possibilities and you've got to try each out in order to find the optimal solution. If you had a 2^100 problem, which would be basically impossible to solve on a classical computer, with a 100-qubit quantum computer, you'd be able to solve it in one operation" (source).

Computing architecture has not changed much since John von Neumann proposed it in 1945, more than 70 years ago. Moore's law rests on the microchip industry's ability to keep shrinking transistors so that chips contain more and more of them, but it does not apply in the quantum realm, where the relevant scale is a single electron and the governing law is quantum mechanics*. It seems very unlikely that fault-tolerant quantum computers that could solve hard problems such as factoring in polynomial time will be realized in 20–25 years, as some have claimed.

Note*: See, for example, the Schrödinger's Cat thought experiment.

Schrödinger’s Cat

Simple explanation videos:

https://youtu.be/UUpqnBzBMEE
https://youtu.be/g_IaVepNDT4
