Machine Learning: The Quantum Way

Are you waiting 3 months to train your neural network to predict whether a picture is of a cat or a dog? Well, the solution might be quantum machine learning.

Increased computational power and data availability, along with algorithmic advances, have led machine learning to some impressive results, but datasets are only going to get bigger and the algorithms only more complex.

The exponential rise in processing power, year after year for the last 50 years, is finally coming to an end. The physical limits of chip fabrication, alongside the ever-increasing size of datasets, are motivating a growing number of researchers to start exploring quantum computation.

What is quantum computation?

Since the very beginning of our discovery of the atom, quantum physics has been defying the everyday logic we are used to: quantum particles can exist in two places at the same time, tunnel through barriers classical physics says are impassable, and even ‘teleport’ their states from one location to another.

Quantum computers aim to harness these capabilities to become highly efficient, using quantum bits, or qubits, instead of the simple combination of ones and zeros. The bits used in our modern computers can only be either one or zero, never both.

A qubit behaves as both a particle and a wave, and that wave-like description can store far more information than a single bit. Although quantum computers also work with ones and zeros, a qubit can additionally be placed in a ‘superposition’, a state in which it represents both a one and a zero at the same time. With two qubits, four scenarios can be represented at once through superposition, which cuts down data-crunching time. By exploiting interference effects, quantum computers can, in a sense, evaluate a function on every point of its domain simultaneously.

A qubit in superposition, where it is a combination of both a 1 and a 0
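
To make this concrete, the state of a qubit can be written in standard bra-ket notation (this is textbook notation, not tied to any particular device):

```latex
% Single-qubit superposition: amplitudes alpha and beta weight the |0> and |1> states
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1

% Two qubits carry four amplitudes at once -- the "four scenarios" mentioned above
|\psi\rangle = \alpha_{00}|00\rangle + \alpha_{01}|01\rangle + \alpha_{10}|10\rangle + \alpha_{11}|11\rangle
```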

Quantum Machine Learning

This leads us to quantum machine learning, a still largely theoretical field that is just starting to develop, combining quantum computing with machine learning.

Applications of quantum machine learning

  1. With data that has millions of columns, you can take a quantum-encoded version of it and decompose it into its principal components. In principle, both the time and the memory required are reduced exponentially compared to the classical routine.
  2. On classical computers, some algorithms can only be run up to a certain number of dimensions; beyond that they require more processing power than these computers have. Quantum computers, however, can run such algorithms exponentially faster. The principles of superposition and entanglement allow them to work efficiently and produce results sooner.
  3. Because of the way qubits work, rather than having one definite value they exist in a ‘superposition’, much like a coin spinning in the air before it lands in your hand. This gives them the potential to process exponentially more data than classical bits.

Implementing Quantum Support Vector Machines

Classification algorithms are essential for pattern recognition and data mining applications. Recently, thanks to advances in the computational capabilities and speed of classical hardware, classification algorithms such as support vector classification have become very popular.

Support vector classification is based on a natural way one might attempt to classify data points into different labels: if the labels in our training data can be separated by some boundary or hyperplane, then we can classify new data simply by which side of that hyperplane it lies on.

In this 2-D picture, several candidate boundary lines can separate the two classes, the red dots and the blue dots. Once a boundary is chosen, any new dot that falls above it is classified as blue, and vice versa.

The above example was 2-D, but realistically the datasets we will be working on are multidimensional, and a common technique used to find such a hyperplane is to apply a non-linear transformation function to the data. This function is called a feature map, as it transforms the raw features into a new set of features, usually in a higher-dimensional space, in which the classes become easier to separate with a hyperplane. In real applications there can be many features in the data, and applying transformations that involve many polynomial combinations of these features leads to extremely high and impractical computational costs.
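
As a rough classical illustration of the feature-map idea, here is a minimal sketch using scikit-learn (an assumption for illustration, not the method used later in this article): two concentric rings cannot be split by a straight line in the raw 2-D space, but become separable after a polynomial feature map, and the kernel trick achieves the same effect without ever building the expanded features explicitly.

```python
# Sketch: explicit polynomial feature map vs. the equivalent kernel trick (scikit-learn).
from sklearn.datasets import make_circles
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.svm import LinearSVC, SVC

# Two concentric rings: impossible to split with a straight line in the raw 2-D space.
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=42)

# Explicit feature map: expand (x1, x2) into polynomial combinations, then fit a linear SVM.
explicit = make_pipeline(PolynomialFeatures(degree=2), LinearSVC(C=1.0, max_iter=10000))
explicit.fit(X, y)

# Kernel trick: the degree-2 polynomial kernel computes the same inner products implicitly.
implicit = SVC(kernel="poly", degree=2, C=1.0)
implicit.fit(X, y)

print("explicit feature map accuracy:", explicit.score(X, y))
print("polynomial kernel accuracy:   ", implicit.score(X, y))
```

Quantum feature maps follow the same logic, except that the transformed features live in the state space of a quantum circuit rather than in a classical vector space.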

Below is an example of a classification problem that requires a feature map whose kernel is not efficient to compute classically, meaning the required computational resources are expected to scale exponentially with the size of the problem. We show how this can be handled on a quantum processor by estimating the kernel directly in the quantum feature space.
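
Concretely, the quantity the quantum processor estimates is the overlap between the feature-mapped quantum states of two data points, which plays the role of a kernel entry:

```latex
% Quantum kernel: overlap of the feature-mapped states of two data points x and x'
K(x, x') = \left| \langle \phi(x) \,|\, \phi(x') \rangle \right|^2
```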

Qiskit

Qiskit is an open-source software development kit for working with quantum computers at the level of circuits and algorithms. In practice, it lets you run your quantum algorithms on prototype quantum computers from the comfort of your home. It works like this: a provider gives us access to a group of backends and tells us which of them are available. A backend can be either a simulator that is included with Qiskit and runs on your local machine, or a cloud-based quantum device hosted by IBM.
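
As a small illustration (a sketch assuming the same older, Aqua-era Qiskit release used in the walkthrough below), you can list the simulator backends that ship with Qiskit and run locally:

```python
# Sketch: list the local simulator backends bundled with Qiskit (older, Aqua-era API).
from qiskit import Aer, BasicAer

# Aer offers high-performance simulators; BasicAer offers pure-Python reference simulators.
print("Aer backends:     ", Aer.backends())
print("BasicAer backends:", BasicAer.backends())

# A specific backend is then selected by name, e.g. the shot-based QASM simulator.
# (Cloud devices are reached through the IBMQ provider after loading an account.)
backend = Aer.get_backend("qasm_simulator")
print("selected backend: ", backend)
```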

Implementing The Quantum Support Vector Machine Algorithm

Now we run our algorithm on a real-world dataset: the breast cancer dataset provided by scikit-learn. The data has 30 features (mean radius, mean texture, …) and is labelled with two classes, malignant and benign.

After installing the required packages, qiskit and qiskit.aqua, we first import all the libraries we need.
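
A minimal set of imports, assuming the older Qiskit Aqua stack (qiskit-aqua and its bundled qiskit.ml datasets) that this walkthrough is based on; in newer Qiskit releases these classes have moved or been replaced:

```python
# Imports for the QSVM walkthrough (older Qiskit + qiskit.aqua; deprecated in newer releases).
from qiskit import Aer                                            # local simulators
from qiskit.circuit.library import ZZFeatureMap                   # the quantum feature map
from qiskit.aqua import QuantumInstance, aqua_globals             # backend wrapper + global seed
from qiskit.aqua.algorithms import QSVM                           # quantum support vector machine
from qiskit.aqua.utils import split_dataset_to_data_and_labels    # dataset helper
from qiskit.ml.datasets import breast_cancer                      # PCA-reduced breast cancer data
```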

With these libraries in place, the dataset can be loaded. From it we take samples to use for training, for testing, and for a final prediction.
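
A sketch of the data preparation, assuming Aqua's breast_cancer helper (which wraps scikit-learn's dataset and reduces it to the requested number of features with PCA); the sample sizes below are illustrative choices, not the article's exact values:

```python
# Load the breast cancer dataset, reduced to 2 features, and split it into
# training samples, test samples, and a few datapoints for a final prediction.
feature_dim = 2          # number of (PCA-reduced) features fed to the feature map
training_size = 20       # samples per class for training (illustrative)
test_size = 10           # samples per class for testing (illustrative)

_, training_input, test_input, class_labels = breast_cancer(
    training_size=training_size,
    test_size=test_size,
    n=feature_dim,
    plot_data=True,      # plots the sampled points, i.e. the output shown below
)

# Held-out datapoints (data array + true labels) used for the final prediction step.
datapoints, class_to_label = split_dataset_to_data_and_labels(test_input)
print(class_labels, class_to_label)
```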

OUTPUT

With the dataset ready, we can set up the Quantum Support Vector Machine algorithm. Here we use the ZZFeatureMap, a quantum feature map. Feature maps of this kind convert classical data into quantum data: each classical data point x is translated into a set of gate parameters in a quantum circuit, creating a quantum state. A feature map is required to build the support vector machine. The Aer provider gives us access to the backend, a choice of several simulators that are included with Qiskit and run on your local machine.
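
A sketch of the QSVM setup, continuing from the data-loading step above and under the same Aqua-era assumptions: the ZZFeatureMap encodes each data point, Aer's qasm_simulator is wrapped in a QuantumInstance, and QSVM estimates the kernel and trains the classifier.

```python
# Seed for reproducibility of both the transpiler and the simulator.
seed = 10598
aqua_globals.random_seed = seed

# Quantum feature map: encodes a 2-feature data point into a parameterized circuit.
feature_map = ZZFeatureMap(feature_dimension=feature_dim, reps=2, entanglement="linear")

# QSVM builds the kernel matrix from state overlaps and trains the classifier on it.
qsvm = QSVM(feature_map, training_input, test_input, datapoints[0])

# Local simulator backend from the Aer provider.
backend = Aer.get_backend("qasm_simulator")
quantum_instance = QuantumInstance(
    backend, shots=1024, seed_simulator=seed, seed_transpiler=seed
)

result = qsvm.run(quantum_instance)
print("testing accuracy: ", result["testing_accuracy"])
print("predicted classes:", result["predicted_classes"])
```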

OUTPUT

Now we implement a classical SVM algorithm to compare the two.
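
For the classical baseline, here is a hedged sketch that uses scikit-learn's SVC directly (the original comparison may have used Aqua's classical SVM wrapper instead): the same breast cancer data, standardized and reduced to two principal components to mirror the quantum feature dimension.

```python
# Classical baseline: scikit-learn SVC on the same dataset, reduced to 2 features with PCA.
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=10598, stratify=y  # same seed value as above
)

# Standardize, project to 2 principal components (mirroring feature_dim), then classify.
classical_svm = make_pipeline(StandardScaler(), PCA(n_components=2), SVC(kernel="rbf"))
classical_svm.fit(X_train, y_train)

print("classical SVC testing accuracy:", classical_svm.score(X_test, y_test))
```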

Conclusion

In this experiment the quantum machine learning model was far more accurate and much quicker than the classical machine learning model, but a lot of work remains: quantum computers have still not reached the point we need them to be at. However hard the simulator backend works, a classical computer cannot imitate a quantum computer at scale without serious high-performance computing resources. Still, progress is being made: IBM has promised a 1,000-qubit quantum computer by 2023, which would be a complete game changer.

Here are some recommended courses if you are interested in the concept of quantum computing and quantum machine learning.
