Quantum Computing — Exploration and Use cases

Gayatri Mohan
Dialogue & Discourse
9 min read · May 13, 2019


In the previous post, we looked at the basics of Quantum Computing and the reasons why this trend is gaining so much momentum. Numerous organizations, big and small, around the globe have invested in quantum computing. Active areas of research and commercial development span the entire range across hardware, software, and quantum-computing-specific algorithms. Some organizations are looking at addressing specific kinds of problems, while others are trying to create systems that could solve all problems. This post describes the various experiments being done in this field and potential applications.

Qubits and Quantum Computers

Qubits, the quantum equivalent of classical bits, are the key elements in any quantum computer. A qubit can be thought of as the simplest two-level (or two-state) system that displays quantum mechanical phenomena. Examples of qubits in the physical world include electrons with their two spin states (spin up & spin down) and photons with vertical and horizontal polarization as the two levels.
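Since a qubit is just a normalized two-level state, its superposition and measurement statistics can be sketched with plain linear algebra. The snippet below is an illustrative NumPy sketch (the amplitudes are my own example values, not from any particular device): it represents a qubit as a complex 2-vector and computes the Born-rule measurement probabilities.

```python
import numpy as np

# Basis states of a two-level system (the computational basis)
ket0 = np.array([1, 0], dtype=complex)  # e.g. spin up / horizontal polarization
ket1 = np.array([0, 1], dtype=complex)  # e.g. spin down / vertical polarization

# A general qubit state is a normalized superposition a|0> + b|1>
a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)   # illustrative equal superposition
psi = a * ket0 + b * ket1

# Measurement probabilities are squared amplitudes (Born rule)
p0 = abs(np.vdot(ket0, psi)) ** 2
p1 = abs(np.vdot(ket1, psi)) ** 2
```

Measuring this equal superposition yields 0 or 1 with probability 1/2 each; the probabilities always sum to one.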

From a technological perspective, the challenge is to create qubits in a stable computing environment with long enough coherence times so that quantum computing operations can be performed. Several different approaches and technologies are being actively tried out to create large-scale commercially viable quantum computers.

One set of approaches uses electrons and their spin-up and spin-down states to create a so-called “spin qubit”. Another method uses photons and light polarization, though this approach faces significant challenges in creating and maintaining single pure photons with long coherence times. The most common and currently successful implementations are the Josephson junction superconducting method and the trapped ion approach, both of which we sketch out below. A more detailed discussion of the maturity of the various qubit implementation technologies is available here.

One common approach leverages the Josephson junction, in which two pieces of superconducting material sandwich a thin non-superconducting barrier between them. Superconducting electrons can tunnel right through the barrier from one superconductor to the other. One of the superconductors can be charged and the other left uncharged, creating a two-level quantum system. Other Josephson junction implementations use magnetic flux or phase, instead of electric charge, to create the two levels needed for a quantum system. This approach to qubit implementation has been fairly successful in creating qubits with long coherence times.

Another currently successful approach is the trapped ion quantum computer. Here, ions are confined and suspended in a vacuum using electromagnetic fields. The stable electronic states of the ions form the levels of the qubits, and lasers or microwaves drive the logic gate operations.

Quantum Supremacy

One class of companies focuses on Quantum Supremacy, trying to come up with general-purpose quantum computers. Quantum Supremacy denotes the potential ability of quantum computing devices to solve problems that classical computers practically cannot. These computers are universal gate-based machines: the computation on the input data is expressed as a unitary operation, specified as a sequence of gate operations and measurements, yielding a quantum circuit. This calls for a highly abstract mathematical model of the device being built¹. Meaningful implementations of this concept might take at least two to three decades, unless a technology breakthrough happens.
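As a minimal illustration of the gate model described above, the sketch below builds a tiny two-qubit circuit from explicit unitary matrices (a Hadamard followed by a CNOT, a standard textbook pair) and applies it to the input state |00⟩, producing an entangled Bell state. This is an illustrative NumPy simulation, not a description of any vendor's machine.

```python
import numpy as np

# Single-qubit Hadamard gate and the identity, as unitary matrices
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)

# Two-qubit CNOT (control = first qubit, target = second qubit)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, apply H to the first qubit, then CNOT
state = np.array([1, 0, 0, 0], dtype=complex)
state = CNOT @ (np.kron(H, I) @ state)

# Result is the entangled Bell state (|00> + |11>) / sqrt(2)
probs = np.abs(state) ** 2
```

Measuring both qubits gives 00 or 11 with probability 1/2 each, and never 01 or 10, which is the hallmark of entanglement a classical circuit cannot reproduce.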

IBM Q (Photo Credit : IBM)

Use-case Based Quantum Machines

Another class of companies focuses on a narrow set of problems that can be addressed through Quantum Annealing. Quantum Annealing is a metaheuristic for finding the global minimum of a function over a set of candidate solutions by means of quantum fluctuations. It is a physical process that builds on models such as Adiabatic Quantum Computation, in which algorithms like prime factorization have also been formulated. Quantum Annealers are machines whose hardware is pre-built for the problem definition.
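Quantum annealing itself cannot be run on a classical machine, but its classical cousin, simulated annealing, conveys the core idea of escaping local minima while searching for a global one. The toy landscape, cooling schedule, and parameters below are my own illustrative choices:

```python
import math
import random

random.seed(0)

# Toy energy landscape: a shallow local minimum near x = +0.96 and a
# deeper global minimum near x = -1.04.
def f(x):
    return (x**2 - 1) ** 2 + 0.3 * x

x = 2.0                       # start far from the global minimum
best_x, best_f = x, f(x)
temp = 2.0                    # initial "temperature"
for _ in range(20000):
    candidate = x + random.uniform(-0.1, 0.1)
    delta = f(candidate) - f(x)
    # Accept all downhill moves; accept uphill moves with Boltzmann
    # probability. In quantum annealing, tunneling through barriers
    # replaces these thermal hops.
    if delta < 0 or random.random() < math.exp(-delta / temp):
        x = candidate
    if f(x) < best_f:
        best_x, best_f = x, f(x)
    temp *= 0.9995            # slow cooling schedule
```

The search crosses the barrier between the two basins while the temperature is high and settles near the global minimum (around x ≈ -1.04, where f is negative) as it cools; a pure greedy descent from x = 2.0 would get stuck in the local minimum.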

D-Wave Annealer (Photo Credit : D-Wave)

Quantum Simulations

Yet another group of companies is exploring a different approach to solving large computations, either through quantum simulations or through quantum-inspired approaches.

· Quantum simulation has reached a point where circuit simulation has gone beyond 49 qubits, demonstrating that the computation of quantum amplitudes for measured outcomes of quantum devices of approximately 50 qubits is possible with existing classical computing resources².

· Quantum-inspired approaches are models that follow the properties of quantum mechanics but run quantum algorithms on a classical computer. This class of companies leverages quantum mechanical properties to run algorithms faster than their classical counterparts, typically on ASICs.

· Some startups are building custom algorithms for specific business problems — software that could run on hardware that is either quantum or classical.

· Variational quantum computing, or quassical computing, is gaining momentum because of its approach to solving problems: classical endpoints provide the inputs and outputs to a system that simulates or runs a quantum algorithm.

A list of available quantum algorithms can be found here.

Fujitsu Digital Annealer (Photo Credit : Fujitsu)

Cluster State Model

Research is also under way on the cluster state model of quantum computation. This model does not use unitary gate operations; instead, it relies on qubits prepared in a cluster state, a highly entangled state. Processing takes place through a series of single-qubit measurements, so the entanglement between qubits decreases over the course of the algorithm; this model is therefore called one-way quantum computation. Cluster state quantum computation has been shown to be computationally equivalent to the standard circuit model³.

Reality! Or are we ready yet?

Decoherence

But none of these giant quantum physics experiments is fault tolerant. The qubits cannot retain their complex coherent state for long: the wave function collapses into one of its eigenstates (a definite state with a fixed value for the observed quantity), a process called decoherence. The end state resembles a time-independent classical state. Decoherence happens when the qubits interact with their environment. The only way to observe or measure the state of a qubit is to make the quantum mechanical system interact with the environment in some way, and when the state vector is observed, it makes a sudden, discontinuous jump to one of its eigenstates — the state vector collapses. Knowing that a state vector will collapse when it interacts with the external environment, we still need to know in what manner this collapse happens: to perform any sort of useful calculation, we must be able to say something about which basis state a quantum mechanical system will collapse into⁴. Without such control, decoherence leaves the system unable to tolerate faults. Recent research has shown fault tolerant schemes that provide a systematic way to improve results when errors are dominated by internal control and measurement, but not when the errors are dominated by decoherence⁵.
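A simple way to picture decoherence is pure dephasing of a density matrix: each interaction with the environment shrinks the off-diagonal "coherence" terms until the superposition resembles a classical 50/50 mixture. The sketch below is a toy model with an assumed per-step dephasing strength, not a physical simulation of any device:

```python
import numpy as np

# Density matrix of the superposition (|0> + |1>) / sqrt(2)
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Pure dephasing: each environment interaction shrinks the off-diagonal
# coherence terms while leaving the populations (diagonal) untouched.
gamma = 0.1  # assumed per-step dephasing strength, for illustration
for _ in range(100):
    rho[0, 1] *= (1 - gamma)
    rho[1, 0] *= (1 - gamma)

coherence = abs(rho[0, 1])  # decays toward zero
```

After enough steps the coherences are negligible and the state behaves like a classical coin flip: the quantum interference that computation relies on is gone, even though no population was lost.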

Fault Tolerance

Quantum error correction (QEC) is an active research area; researchers have devised several error correction codes that organizations are now experimenting with. The objective of quantum error correction is to encode the information of a single logical qubit redundantly across multiple physical qubits. This introduces the distinction between physical and logical qubits: a logical qubit is a collection of physical qubits that together hold one qubit's worth of encoded information. Current estimates put the physical-to-logical qubit ratio at around 10⁴:1, implying an enormous growth in the number of physical qubits needed to obtain enough error-corrected logical qubits for a meaningful fault-tolerant calculation in a system that suffers early decoherence. Computational power increases only as these qubit counts scale up dramatically.
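The physical-to-logical encoding idea can be illustrated with the simplest redundancy scheme, a three-bit repetition code with majority-vote decoding. This is a classical toy (parameters chosen for illustration; real quantum codes such as the surface code are far more involved), but it shows how encoding one logical value into three physical carriers suppresses the error rate from p to roughly 3p² for small p:

```python
import random

random.seed(1)

def encode(bit):
    # One logical bit encoded into three physical copies
    return [bit, bit, bit]

def noisy_channel(codeword, p_flip):
    # Each physical bit flips independently with probability p_flip
    return [b ^ 1 if random.random() < p_flip else b for b in codeword]

def decode(codeword):
    # Majority vote corrects any single bit flip
    return 1 if sum(codeword) >= 2 else 0

p = 0.05       # physical error rate, for illustration
trials = 20000
errors = sum(decode(noisy_channel(encode(0), p)) != 0 for _ in range(trials))
logical_error_rate = errors / trials  # expected ~3*p^2 ≈ 0.0073, well below p
```

The logical error rate lands close to 3p² ≈ 0.0073, far below the physical rate of 0.05; the price is a 3:1 overhead in physical carriers, and the 10⁴:1 figure above reflects how much steeper that price is for genuinely quantum codes.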

What the future holds….

…for industries and businesses

Several industries have started embracing the theories and experimentation currently happening in the quantum computing field. Quantum computers will be of immense benefit in materials science and pharmaceuticals. Current methods of drug discovery are heuristic and very challenging: pharmaceutical companies must screen hundreds of thousands of compounds against the targets of a therapy, which involves computationally complex, memory- and processing-intensive procedures to identify candidate compounds for specific problems. This can even go to the next level of identifying tailored drugs for individual patients, as each patient and their disease are unique⁶. For protein folding, Monte Carlo simulations can be run on quantum computers through quantum amplitude amplification (an extension of Grover’s algorithm).
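Grover's algorithm, mentioned above, can be simulated on a classical machine for small sizes by manipulating the full state vector directly. The sketch below is illustrative NumPy code (the marked index is an arbitrary choice of mine); it finds the marked item with high probability in about π/4·√N iterations, versus O(N) classical lookups:

```python
import numpy as np

# Grover search over N = 2^n items for a single marked index, simulated
# as explicit operations on the full 2^n-dimensional state vector.
n = 4
N = 2 ** n
marked = 11  # arbitrary target index, for illustration

state = np.full(N, 1 / np.sqrt(N))  # uniform superposition over all items

iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))  # ~O(sqrt(N)) queries
for _ in range(iterations):
    state[marked] *= -1               # oracle: flip the marked amplitude
    state = 2 * state.mean() - state  # diffusion: inversion about the mean

success_probability = state[marked] ** 2
```

For N = 16, three Grover iterations already concentrate over 95% of the probability on the marked item; a classical search would need about N/2 = 8 queries on average.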

Quantum Computing finds immense use in agriculture; one key application is finding better catalysts for producing the ammonia used in fertilizers. The space of candidate combinations is vast and not tractable for classical computers without heuristic methods: for today’s supercomputers to digitally test the right catalytic combinations to make ammonia would take centuries.

Another industry that will see major breakthroughs is financial services, where the focus is on leveraging quantum annealing for portfolio optimization. Most financial problems have optimization at the core of their complexity and are classified as NP-hard (finding the shortest route in a traveling salesman problem is an NP-hard optimization problem). Simple cases of optimization can be solved by classical linear programming, but as the complexity increases, solutions can be found through simulated annealing. Quantum annealing is similar to simulated annealing: quantum tunneling plays the role that thermal fluctuations play in the classical world, letting the system jump between local minima. Some optimization problems, such as the minimum spanning tree, maximizing flow-like variables, and the implementation of Monte Carlo methods, can be solved using adaptations of Grover’s algorithm.

The Quantum Approximate Optimization Algorithm is believed to provide good solutions in polynomial time to optimization problems that take exponential time on classical computers⁷. Other implementations include the quantum equivalent of the Fast Fourier transform: the Quantum Fourier transform, which needs only O(log² N) gates where the classical FFT needs O(N log N) operations.
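As a matrix, the Quantum Fourier transform is exactly the unitary discrete Fourier transform; what the quantum circuit buys is implementing it in O(log² N) gates. The sketch below (assuming NumPy) builds the QFT matrix explicitly and checks it against the classical FFT, up to normalization and sign convention:

```python
import numpy as np

# The QFT on n qubits is the unitary DFT matrix acting on N = 2^n
# amplitudes. A quantum circuit realizes it with O(n^2) = O(log^2 N)
# gates; a classical FFT on the same vector costs O(N log N) operations.
n = 3
N = 2 ** n
omega = np.exp(2j * np.pi / N)
QFT = np.array([[omega ** (j * k) for k in range(N)]
                for j in range(N)]) / np.sqrt(N)

x = np.arange(N, dtype=complex)
y_qft = QFT @ x
# NumPy's inverse FFT uses the same sign convention; rescale to compare.
y_fft = np.fft.ifft(x) * np.sqrt(N)
```

The two results match exactly, and the matrix is unitary (QFT†QFT = I), which is what lets a quantum circuit apply it reversibly to a superposition of all N amplitudes at once.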

Stock markets see direct benefits, as quantum computing may disrupt financial arbitrage and algorithmic trading mechanisms. Finding a profitable arbitrage can be translated into a QUBO (Quadratic Unconstrained Binary Optimization) problem that can be solved using a Quantum Annealer.
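A QUBO instance is just a quadratic energy function over binary variables, and for a handful of variables it can be minimized by brute force, which is what the sketch below does (the matrix Q is an arbitrary illustrative example of mine, not an arbitrage formulation). An annealer searches the same energy landscape without enumerating it:

```python
import itertools
import numpy as np

# A QUBO instance: minimize x^T Q x over binary vectors x.
# Small symmetric matrix, chosen only for illustration.
Q = np.array([[-3.0,  2.0,  0.0],
              [ 2.0, -2.0,  1.0],
              [ 0.0,  1.0, -1.0]])

def qubo_energy(x, Q):
    return x @ Q @ x

# Brute force over all 2^n binary assignments; an annealer explores this
# same landscape physically instead of enumerating it.
best = min(itertools.product([0, 1], repeat=3),
           key=lambda x: qubo_energy(np.array(x), Q))
best_energy = qubo_energy(np.array(best), Q)
```

Here the minimum is x = (1, 0, 1) with energy -4: the negative diagonal terms reward switching those variables on, while the +2 coupling penalizes turning on the first two together. Brute force scales as 2^n, which is exactly why annealing hardware is interesting for larger instances.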

As we can see, many algorithms are being recreated, redefined, or redesigned to work on quantum computers; some of them do not even have a classical counterpart. Many startups are getting ahead of the curve by building algorithms that can run on quantum hardware, focusing on understanding customer needs and building custom algorithms to solve their problems.

…for the world

The world waits for Quantum Supremacy to arrive. While we wait, quantum hardware (Quantum Annealers & Simulators) that solves specific use cases is being built, and a new era of NISQ (noisy intermediate-scale quantum) systems is evolving. NISQ systems do not completely eliminate noise, but minimize it. Quantum computers operating with 50–100 qubits could be created; though they would not change the world right away, they would surpass the capabilities of current classical computers, even as the noise in the circuits limits the complexity of the calculations that can be performed⁸. An optimistic way forward is to identify problems that such computers could already solve. A more realizable approach is to augment quantum computing with classical preprocessing, a quantum-as-a-service model. Another interesting and feasible direction is Quassical Computing, an architecture that combines the capabilities of quantum and classical architectures⁹. Ahead of the curve, and a nearer milestone, is the ability to run simulation models: executing quantum algorithms on classical hardware, the so-called quantum-inspired approach.

References

1. https://www.sciencedirect.com/science/article/pii/S0010465517301935

2. https://arxiv.org/pdf/1710.05867.pdf

3. http://mmrc.amss.cas.cn/tlb/201702/W020170224608150244118.pdf

4. https://pdfs.semanticscholar.org/8072/dc7247460849b18abbb463429a09cfb2e3e6.pdf

5. https://arxiv.org/abs/1805.05227

6. https://www.sciencemag.org/news/2017/09/quantum-computer-simulates-largest-molecule-yet-sparking-hope-future-drug-discoveries?r3f_986=https://www.google.com/

7. https://arxiv.org/abs/1807.03890

8. https://arxiv.org/abs/1801.00862

9. https://arxiv.org/abs/1801.00862
