Highlighting Quantum Computing for Machine Learning

Riahi LOURIZ
Meetech - We Love Tech
8 min read · Sep 25, 2019

Introduction

The pace of development in quantum computing mirrors the rapid advances made in machine learning and artificial intelligence. It is natural to ask whether quantum technologies could boost learning algorithms: this field of inquiry is called quantum-enhanced machine learning. The goal of this article is to present quantum computing in detail, starting from the definition of a QuBit and ending with the benefits that current and future quantum technologies can bring to machine learning, with a focus on algorithms that are challenging for classical digital computers.

What is a Quantum Bit (QuBit) ?

QuBits are the basic units of information manipulated by quantum computers, the counterpart of the bits of classical computing. With them, we move from a deterministic world to a probabilistic one. The following figure compares classical data to quantum data.

|0> and |1> are the notations used to denote the states 0 and 1, respectively.

The elementary particles most commonly used to realise QuBits are photons and electrons.

The most commonly exploited quantum properties are the phase of a photon and the energy level or spin direction of an electron.

The Bloch sphere : a mathematical representation of the QuBit

This model represents the state of a QuBit (or of any two-state quantum system) as a two-dimensional vector whose length, its so-called norm, is always 1. The vector has two components, the complex amplitudes α and β, satisfying |α|² + |β|² = 1.

The main idea to keep in mind from this mathematical representation is that at any time, except at initialisation and at the moment of reading, a QuBit can be written as a superposition of two states, as follows :

|ψ> = α |0> + β |1>
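The superposition above can be sketched in a few lines of plain Python. This is an illustrative toy, not a quantum library: the state is just a pair of amplitudes, and the squared magnitudes give the probabilities of reading 0 or 1.

```python
import math

# A qubit state |psi> = alpha|0> + beta|1> as two complex amplitudes.
# Here we pick the equal superposition alpha = beta = 1/sqrt(2).
alpha = 1 / math.sqrt(2)
beta = 1 / math.sqrt(2)

# The norm condition |alpha|^2 + |beta|^2 = 1 must hold.
norm = abs(alpha) ** 2 + abs(beta) ** 2
print(round(norm, 10))  # 1.0

# Probabilities of measuring 0 and 1.
p0 = abs(alpha) ** 2
p1 = abs(beta) ** 2
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```

Any other pair of amplitudes works as long as the norm condition holds; the Bloch sphere is exactly the set of all such normalised pairs (up to a global phase).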

How do QuBits compare with classical Bits ?

A QuBit is the basic unit of quantum information — the quantum version of the classical binary bit, physically realised with a two-state device. A QuBit is a two-level quantum-mechanical system. The figure below shows the main differences between a Bit and a QuBit.

Quantum registers vs classical registers

Now that we have an idea of the difference between a Bit and a QuBit, let us explore registers. Indeed, one QuBit can do little alone; we need more to represent more data. Here is an example using a register of 4 QuBits :

The table below gives a detailed overview of a register of n QuBits compared to a register of n bits.

These 2^n states do not, however, correspond to a genuine information storage capacity. It is a state-superposition capability, which computations then exploit to bring out the combinations we are searching for, according to a given algorithm.
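The 2^n scaling is easy to make concrete. In the simulation sketch below (plain Python, illustrative only), a register of n QuBits is described by a vector of 2^n complex amplitudes, one per basis state, whereas a classical n-bit register holds exactly one of those 2^n values at a time.

```python
# A register of n qubits is described by 2**n complex amplitudes,
# one per basis state |00...0> through |11...1>.
n = 4
amplitudes = [0j] * (2 ** n)
amplitudes[0] = 1 + 0j  # register initialised to |0000>

print(len(amplitudes))  # 16

# A classical 4-bit register, by contrast, stores exactly one
# of these 16 values at any given moment.
```

This is also why simulating quantum registers on classical hardware becomes intractable quickly: 50 qubits already require 2^50 amplitudes.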

The three most important basic principles at work in QuBits :

  • Superposition : allows a QuBit to be in states 0 and 1 at the same time.
  • Entanglement : entanglement links QuBits together so that their states are correlated and effectively synchronized; measuring one instantly determines the state of the other, although the entangled QuBits cannot be read or modified independently.
  • Wave-particle duality : wave-particle duality makes it possible, in some cases, to interact with QuBits, or to make QuBits interact with each other through interference, in the context of quantum algorithms.

Superposition → information in QuBits

Entanglement → connection between QuBits

Wave-particle duality → interference in QuBits
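Superposition and entanglement can be demonstrated together by building a Bell state on a 2-qubit state vector. The sketch below (plain Python, gates written out by hand rather than as matrices) applies a Hadamard gate and then a CNOT, producing the state (|00> + |11>)/√2, in which measuring one qubit fixes the other.

```python
import math

# 2-qubit state vector; index encodes the basis state:
# 0 -> |00>, 1 -> |01>, 2 -> |10>, 3 -> |11>.
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

# Hadamard on the first qubit: |0> -> (|0> + |1>)/sqrt(2).
h = 1 / math.sqrt(2)
state = [h * (state[0] + state[2]),
         h * (state[1] + state[3]),
         h * (state[0] - state[2]),
         h * (state[1] - state[3])]

# CNOT with the first qubit as control: swaps |10> <-> |11>.
state[2], state[3] = state[3], state[2]

# Bell state (|00> + |11>)/sqrt(2): the qubits are now entangled.
print([round(a, 3) for a in state])  # [0.707, 0.0, 0.0, 0.707]
```

Only the outcomes 00 and 11 have nonzero probability, so reading the first qubit immediately tells you the value of the second: this correlation is the "connection between QuBits" summarised above.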

Global architecture of a quantum computer :

The big classes of quantum algorithms

  • Search algorithms : based on the algorithms of Deutsch-Jozsa, Simon and Grover.
  • Equilibrium-seeking algorithms : algorithms that look for an equilibrium point of a complex system, as in the training of neural networks, the search for an optimal path in a network, or process optimization.
  • Algorithms based on the Quantum Fourier Transform (QFT) : such as Shor's integer factorization, which triggered a debate between those who want to build quantum computers capable of breaking RSA-type public security keys and those who seek to protect digital communications with algorithms resistant to fast integer factorization.
  • Simulation algorithms for quantum mechanisms : used in particular to simulate interactions between atoms in various molecular structures, inorganic and organic.

Reduction of the wave packet : limits the speedup achievable over some classical algorithms

The reduction of the wave packet (wavefunction collapse) is a fundamental notion of quantum mechanics, and thus of quantum computing: after a measurement, a physical system sees its state completely reduced to the one that has been measured. This notion creates many difficulties for the implementation of quantum algorithms, and in particular for the parallelisation of intermediate computations, which is the main gain of quantum computing. Concretely, to implement a classical algorithm on a quantum computer, it is necessary to find an implementation that exploits the power of parallelisation only in the intermediate calculations and produces a single result. The following diagram illustrates this idea:
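A minimal sketch of this collapse, in plain Python: whatever superposition (and hence implicit parallelism) the state carried before measurement, reading it yields exactly one classical outcome, and the post-measurement state retains only that outcome. The `measure` helper below is a hypothetical name for illustration.

```python
import math
import random

# Equal superposition: either outcome has probability 1/2.
alpha, beta = 1 / math.sqrt(2), 1 / math.sqrt(2)

def measure(a, b):
    """Sample one measurement outcome and return (outcome, collapsed state)."""
    if random.random() < abs(a) ** 2:
        return 0, (1, 0)   # state collapses to |0>
    return 1, (0, 1)       # state collapses to |1>

outcome, collapsed = measure(alpha, beta)
# Only this single classical result survives the measurement:
print(outcome, collapsed)
```

This is why quantum algorithms must be designed so that interference concentrates amplitude on the answer before measuring: a naive "compute everything in superposition, then read" strategy returns one random branch, not all of them.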

Having gone over the basic elements of quantum computing and shed light on the implementation difficulties related to the reduction of the wave packet, we will now discuss what quantum computing can bring to Machine Learning.

Quantum Machine Learning

Before diving into the definition of Quantum Machine Learning, let’s first define the concept of Machine Learning and Quantum Computer :

  • Machine Learning : how computers learn patterns in data.
  • Quantum computing : the use of quantum mechanical phenomena such as superposition and entanglement to perform computation. A quantum computer is a device that performs such computation, whether implemented theoretically or physically.
  • Quantum Machine Learning : how quantum computers and other quantum information processors can learn patterns in data that cannot be learned by classical machine learning algorithms.

Why might we hope that quantum mechanics could help in learning patterns in nature ?

For two reasons :

  • First, quantum systems are famous for generating strange and counter-intuitive patterns that cannot be generated by any classical system, including a classical computer. Quantum mechanics is weird. We can reasonably hope that, in addition to generating such strange and counter-intuitive patterns, quantum computers could also recognize patterns that cannot be recognized by any classical computer. So quantum weirdness might help us understand the weird features of nature itself.
  • Second, a large number of the classical machine learning algorithms for finding patterns in nature are based on linear algebra, on the manipulation of high-dimensional vectors of data. And quantum mechanics is entirely about linear algebra.

Classical Machine Learning VS Quantum Machine Learning

If we want to run a machine learning algorithm on a classical data set using quantum computing, we need to be able to convert this dataset into quantum data (quantum states) and then apply the quantum algorithms to it. The main advantage is speed.

Comparison of complexity for BLAS (Basic Linear Algebra Subroutines) and Quantum BLAS

Here is a comparison between the most widely used subroutines in the field of Machine Learning and those used in Quantum Machine Learning :

These quantum subroutines can dramatically reduce the computing cost of the majority of learning algorithms. The key point for this kind of quantum algorithm used in Machine Learning is to show how you can take a large classical vector of data and map it into a quantum state. For this, you need a device that takes classical data, defined, for instance, in the pits of a CD (Compact Disc), in pulses of light going down a fiber-optic cable, or stored in the capacitors of an integrated circuit, and maps it into a quantum state. The device that does this is called a quantum Random Access Memory, or qRAM.
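The classical-to-quantum mapping a qRAM would enable is usually amplitude encoding: a vector of 2^n classical values is normalised and written into the amplitudes of an n-qubit state. A minimal sketch of that encoding step, with made-up example data:

```python
import math

# Amplitude encoding: 2**n classical values -> n-qubit state.
# Example data is arbitrary; 8 values fit in 3 qubits.
data = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]

# Normalise so the squared amplitudes sum to 1, as a quantum state requires.
norm = math.sqrt(sum(x * x for x in data))
amplitudes = [x / norm for x in data]

print(len(data).bit_length() - 1)                 # 3 qubits suffice
print(round(sum(a * a for a in amplitudes), 10))  # 1.0
```

The exponential compression is the point: N-dimensional data needs only log2(N) qubits, which is where many of the quantum BLAS speedups come from (assuming an efficient qRAM, which remains a significant hardware challenge).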

Quantum speedups in Machine Learning

Source

In the above table, speedups are taken with respect to their classical counterpart(s): O(√N) means a quadratic speedup and O(log(N)) an exponential speedup relative to the classical counterpart.
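To get a feel for what quadratic versus exponential speedup means in practice, the short comparison below prints the step counts N, √N and log2(N) for a few problem sizes:

```python
import math

# For a problem of size N, compare a classical cost of N steps
# with sqrt(N) (quadratic speedup) and log2(N) (exponential speedup).
for N in (10 ** 3, 10 ** 6, 10 ** 9):
    print(N, round(math.sqrt(N)), round(math.log2(N)))
# 1000 32 10
# 1000000 1000 20
# 1000000000 31623 30
```

At a billion items, a Grover-style quadratic speedup still needs tens of thousands of steps, while an exponential speedup needs only about thirty, which is why the O(log N) entries in the table are the most sought after (caveats permitting).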

Keep in mind that some of these algorithms come with caveats that can limit their applicability.

Quantum’s application in Machine Learning

Limits and discoveries in Quantum Deep Learning

1 — Main obstacles limiting quantum growth in the deep learning area

  • The first obstacle to quantum neural networks was the lack of a real quantum computer to experiment with.
  • The second obstacle was the impossibility of training quantum networks.
  • The third problem was that the classical neuron/perceptron uses nonlinear activation functions, putting it in conflict with quantum qubits, which evolve only through unitary, linear operations.

2 — Main discoveries have changed these obstacles

  • Several companies have delivered quantum computers in recent years, including IBM, which has made its machines available to researchers for free over the Internet.
  • A new algorithm now solves the training problem using two main steps : a simple quantum neural network and quantum neural network training.
  • The nonlinearity issue has been solved with a new quantum perceptron using a special quantum circuit, the Repeat-Until-Success (RUS) circuit.

Conclusion

Quantum mechanics is weird. Quantum systems can generate weird patterns that are very hard to generate classically. But quantum systems can also learn and recognize weird patterns that can't be recognized classically. The reason they can do this is that quantum computing and quantum mechanics are all about linear algebra. With a relatively small number of quantum bits, we can have states that represent vectors in a very high-dimensional vector space. Quantum computers can then be used to perform linear algebra operations, such as Fast Fourier Transforms, finding eigenvectors and eigenvalues, and inverting matrices, exponentially faster than we can classically.

For more information about this subject, here are the main references I explored while writing this article :


Consultant — Data Scientist Machine Learning Data Lab Wavestone