Rethinking Priorities in Quantum Computing

Are there algorithms that can leverage the laws of quantum mechanics to solve difficult problems?

RAND

4 min read · Aug 21, 2024


By Nicolas M. Robles

Concept of a quantum computer CPU. Photo by Olemedia/Getty Images

The emergence of quantum computing suggests vast possibilities in chemistry, materials science, predictive simulation, and other areas. But first, research on quantum algorithms must be prioritized to understand the feasibility, generality, and advantages of quantum computing over classical computing. Even if sufficiently good quantum hardware is built, are there algorithms that can leverage the laws of quantum mechanics to solve difficult problems?

Classical computers use on (1) / off (0) bits as the basic units of information to arrive at the best solution to a problem. Quantum computers employ quantum bits, known as qubits, which can represent information in both 0 and 1 states simultaneously. Known as superposition, this can allow a quantum computer to hold many candidate solutions to a problem at the same time. Qubits can also be correlated with each other in such a way that the state of one qubit depends on the state of another, even when the qubits are physically separated. This is the principle of entanglement, a foundational element of quantum computing.
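Superposition and entanglement can be illustrated without any quantum hardware by tracking the amplitudes of a small quantum state directly. The following is a minimal NumPy sketch (a classical simulation, not a real device); the gate matrices are standard, but the variable names are illustrative.

```python
import numpy as np

# Single-qubit basis states |0> and |1> as amplitude vectors.
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
plus = H @ zero

# Measurement probabilities are the squared amplitudes: 50/50.
probs = plus ** 2

# Entanglement: a CNOT gate applied to (H|0>) tensor |0> yields the Bell
# state (|00> + |11>)/sqrt(2) -- the two qubits' measurement outcomes
# become perfectly correlated.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
bell = CNOT @ np.kron(plus, zero)
```

Note that the amplitude vector for n qubits has 2^n entries, which is why classical simulation of this kind quickly becomes intractable as systems grow.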

In a radical departure from classical physics, within quantum physics there are no certain outcomes; one must deal with probabilities of obtaining certain results from a series of measurements. This requires researchers to single out desirable outcomes of a measurement by increasing the probabilities of desirable outcomes (the solution to the given problem) and decreasing the probabilities of unwanted outcomes, via a process known as interference. For instance, in solving a tough combinatorial problem in logistics such as route planning or resource allocation, where all the possibilities (e.g., routes and allocations) exist in the superposition, interference can help zero in on the desired solution among these possibilities.
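This amplify-and-cancel pattern can be made concrete with one iteration of Grover's search over two qubits, again simulated classically with NumPy. The oracle marks one "desired solution" by flipping the sign of its amplitude, and the diffusion step (inversion about the mean) uses interference to boost that amplitude while canceling the others; which state is marked is an arbitrary choice here.

```python
import numpy as np

n_states = 4              # two qubits -> four basis states (candidate solutions)
marked = 2                # the "desired solution" among the possibilities

# Superposition: every candidate starts with equal amplitude 1/sqrt(4).
amps = np.full(n_states, 1 / np.sqrt(n_states))

# Oracle: flip the phase (sign) of the marked state's amplitude.
amps[marked] *= -1

# Diffusion, "inversion about the mean": constructive interference on the
# marked state, destructive interference on the rest.
amps = 2 * amps.mean() - amps

# Measurement probabilities after one Grover iteration.
probs = amps ** 2
```

For four states a single iteration already concentrates all probability on the marked state; for larger search spaces, roughly sqrt(N) iterations are needed, which is the source of Grover's quadratic speedup.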

The unique attributes of quantum computing allow quantum computers to perform certain types of calculations much more efficiently than classical computers.

These unique attributes of quantum computing — superposition, entanglement, and interference — allow quantum computers to perform certain types of calculations much more efficiently than classical computers. But while quantum computing has some clear advantages, challenges are hampering the development of algorithms that can deliver those advantages in practice.

The hardware challenge is a significant obstacle in the race to deliver effective quantum computing, but it may not be the hardest one. It is unclear if or when algorithms will completely deliver solutions even if they are running on flawless hardware. And there are many situations in which the mapping of the problem at hand into the quantum language requires a compromise.

For example, the linear systems of equations targeted by the Harrow-Hassidim-Lloyd (HHL) algorithm can be solved on a classical computer through a taxing, time-consuming process that does produce a full solution. Running the HHL algorithm on a quantum computer quickly produces only a partial solution, limited to what can be observed. In this case, the quantum advantage is not guaranteed due to the difficulty in loading (not solving) the problem we wish to solve into the quantum computer.
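The readout limitation can be illustrated classically. The sketch below (plain NumPy, not an implementation of HHL) solves a small linear system in full, then shows the kind of partial answer a quantum solver provides: HHL prepares a state whose amplitudes are proportional to the solution, and measurement yields only summary quantities such as an expectation value, not the full list of entries. The particular matrices here are illustrative.

```python
import numpy as np

# A small linear system Ax = b, the kind of problem HHL targets.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 0.0])

# Classical solve: expensive at scale, but it returns every entry of x.
x = np.linalg.solve(A, b)

# A quantum solver instead prepares a state proportional to x; reading the
# state out destroys it, so in practice one extracts only observables,
# e.g. an expectation value <x|M|x> for some measurement operator M.
x_state = x / np.linalg.norm(x)
M = np.array([[1.0, 0.0],
              [0.0, -1.0]])        # illustrative observable
expectation = x_state @ M @ x_state
```

Recovering all 2^n amplitudes of the solution state would require on the order of 2^n measurements, which would erase the exponential speedup; this is why the quantum output is described as partial.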

A serious roadblock to the success of quantum machine learning has been the efficient preparation of the data required to run a quantum computer. Algorithmic research will be instrumental to understanding this step in the process.

The same holds true for physical and chemical applications, which typically involve solving a differential equation that governs the evolution of a physical or chemical process. While computing the evolution of these processes with a quantum computer and a suitable algorithm could be exponentially faster, extracting the solution from the quantum computer presents a severe limiting factor. The extraction can reduce the overall speedup from exponential to quadratic, and the resulting solution is once again only partially complete.

A serious roadblock to the success of quantum machine learning has been the efficient preparation of the data required to run a quantum computer.

In the early days of research on quantum machine learning, it was assumed that machine learning tasks could be broken down into their linear-algebraic components and that quantum computers — which excel at matrix operations — would improve artificial intelligence (AI) by employing native quantum AI algorithms. Today, that theory has somewhat reversed: how can classical AI and large language models be of service to quantum computing?

The previously mentioned quantum algorithms have come close to supplying the answer to a difficult problem, but they either fall short of delivering a full solution or require so much overhead that they provide only minimal advantage over classical computers.

Although quantum mechanics has been verified to high degrees of accuracy, it remains a mysterious theory. Most quantum computing applications so far have come with restrictions, either because of a substantial compromise, as in the case of HHL, or because of the time and effort required for data loading, which reduces the advantage.

By prioritizing efforts to understand and develop quantum algorithms, the potential advantages of quantum computing can be better understood. But the critical question of whether quantum computing is feasible must be answered before investing more in building quantum computers.

Nicolas Robles is a mathematician at RAND.

This originally appeared on rand.org on July 12, 2024.


RAND

We help improve policy and decisionmaking through research and analysis. We are nonprofit, nonpartisan, and committed to the public interest.