Quantum Neural Networks

MIT 6.s089 — Intro to Quantum Computing
Feb 5, 2023

How are quantum neural networks built, and do they pose an advantage over classical neural networks?

By Emily Chen & Angela Zhao

Introduction & Motivation

Both quantum computing and neural networks are subjects of burgeoning research, making it unsurprising that the combination of the two is a topic of high interest. The concept of quantum neural networks first appeared in publications by Subhash Kak and Ron Chrisley in 1995, in relation to the idea of the quantum mind, which proposes that classical mechanics is incapable of explaining human consciousness. Modern-day work with quantum neural networks, however, focuses on adapting the classical artificial neural network to incorporate quantum advantages. An important motivator for quantum neural networks is the difficulty of training large classical neural networks, particularly in big-data applications; quantum computing offers the possibility of dramatically reducing the cost of such training. Quantum neural networks not only hold practical potential but could also further our understanding of human consciousness. In this article, we explore how quantum neural networks are built, trained, and evaluated, and what advantages they might have over classical ones.

Overview of Classical Neural Networks

Before we dive into how to build quantum neural networks, we will give a brief overview of classical neural networks: how they work, how we use them, and why they are so important. Biologically, a neural network is a group of neurons connected by synapses. In artificial intelligence, neural networks similarly connect nodes (“artificial neurons”) and have a wide range of capabilities.

At a high level, neural networks work as follows: we send in a set of inputs, process those inputs through a series of hidden layers and activation functions, and end with a set of outputs. We train these networks by adjusting their parameters according to some cost function. By building a neural network, we are teaching the computer how to process data to produce the output that we want. This turns out to produce powerful results: it helps with language translation, image processing, handwritten text recognition, and more.
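
To make this concrete, here is a minimal NumPy sketch of a forward pass through a small fully connected network. The layer sizes, random weights, and ReLU activation are illustrative assumptions rather than a reference implementation.

```python
import numpy as np

def relu(x):
    # Elementwise non-linearity applied after each hidden layer.
    return np.maximum(0.0, x)

def forward(x, weights, biases):
    """Pass inputs through the hidden layers, then a linear output layer."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(W @ h + b)                   # hidden layer + activation
    return weights[-1] @ h + biases[-1]       # output layer

rng = np.random.default_rng(0)
sizes = [4, 8, 8, 2]   # illustrative: 4 inputs, two hidden layers of 8, 2 outputs
weights = [rng.normal(0.0, 0.5, (m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]

print(forward(rng.normal(size=4), weights, biases))
```

Training would then adjust `weights` and `biases` to reduce a cost function, typically via gradient descent.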

Limitations of Quantum: Perceptrons

In order to understand how the classical structure of neural networks can be applied to quantum computing, we have to understand the concept of perceptrons. A perceptron is a threshold unit commonly used in neural networks to help inform decision-making. Because quantum operations are linear and measurement is probabilistic, however, it is difficult to find a quantum interpretation of a perceptron. Existing ideas include attempts at non-linear quantum operators, but this difficulty demonstrates that classical neural networks cannot be translated perfectly into quantum ones.

A classical perceptron computes a weighted sum of its inputs and applies a threshold function to decide whether it fires. Finding a correct and accurate quantum generalization of the classical perceptron is one of many hurdles faced by researchers hoping to develop quantum machine learning algorithms, and it is an area of current research. Several variations of quantum perceptrons have been proposed: these include proposals that exploit qubit circuits as perceptrons, as well as ones based on continuous-variable quantum systems (light). Ultimately, different quantum perceptron structures are optimized for different use cases.
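
As a toy illustration, the sketch below implements a classical perceptron as a thresholded weighted sum; the weights realizing a logical AND are an illustrative assumption.

```python
import numpy as np

def perceptron(x, w, b):
    # Weighted sum of inputs followed by a hard threshold: fire (1) or not (0).
    return 1 if np.dot(w, x) + b > 0 else 0

# Illustrative weights implementing logical AND on binary inputs.
w, b = np.array([1.0, 1.0]), -1.5
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, perceptron(np.array(x), w, b))
```

It is exactly this hard, non-linear threshold that has no direct counterpart among linear, unitary quantum operations.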

How Quantum Neural Networks are Built

Variational Quantum Circuits

Recent developments in variational quantum circuits (VQCs) have opened up a world of possibilities within quantum computing, including quantum machine learning. Variational quantum circuits are circuits that use rotation gates with free parameters to perform various calculations. They are similar to artificial neural networks in that they approximate functions through parameter learning, meaning that with a few modifications they can be used as quantum neural networks.

Since all quantum gate operations are reversible linear operations, non-linear activation functions cannot be applied directly; instead, the parametrized rotation layers are interleaved with entanglement layers, giving the circuit a multilayer structure.

On a high level, a VQC-based quantum neural network processes data as follows: it first encodes the input values into an appropriate qubit state, then transforms this state through parametrized rotation gates and entangling gates, and finally measures the transformed state by computing the expected value of the system's Hamiltonian operator. The resulting expectation value is decoded into a more appropriate output, and the parameters of the quantum neural network are updated using an appropriate optimizer.
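
The sketch below simulates this pipeline for a toy two-qubit circuit in NumPy: angle encoding with RY rotations, one layer of trainable rotations, a CNOT entangler, and the expected value of Z⊗Z as the measured observable. The layout and the choice of observable are illustrative assumptions, and the parameter-shift rule stands in for a full optimizer step.

```python
import numpy as np

Z = np.diag([1.0, -1.0])
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def ry(theta):
    # Parametrized single-qubit rotation about the Y axis.
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def circuit_state(x, thetas):
    """Encode the input, apply trainable rotations, then entangle."""
    state = np.zeros(4); state[0] = 1.0                      # start in |00>
    state = np.kron(ry(x[0]), ry(x[1])) @ state              # data-encoding layer
    state = np.kron(ry(thetas[0]), ry(thetas[1])) @ state    # trainable rotations
    return CNOT @ state                                      # entangling gate

def expectation(x, thetas):
    # QNN output: the expected value <psi| Z (x) Z |psi> of the observable.
    psi = circuit_state(x, thetas)
    return psi @ np.kron(Z, Z) @ psi

def parameter_shift_grad(x, thetas, i):
    # Exact gradient for rotation gates via the parameter-shift rule.
    plus, minus = thetas.copy(), thetas.copy()
    plus[i] += np.pi / 2; minus[i] -= np.pi / 2
    return 0.5 * (expectation(x, plus) - expectation(x, minus))

x, thetas = np.array([0.3, 1.1]), np.array([0.5, -0.2])
print(expectation(x, thetas), parameter_shift_grad(x, thetas, 0))
```

An optimizer would repeat the measure-and-shift loop, nudging `thetas` to move the expectation value toward the desired output.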

Feed-Forward Networks

The majority of current-day quantum neural networks are developed as feed-forward networks. Feed-forward neural networks (FNNs) refer to neural networks in which the connections between nodes do not form a cycle. FNNs were the first and simplest neural networks to be developed, and the name comes from the fact that information only moves forward within an FNN: from the input nodes, through the hidden nodes, and to the output nodes.

A general quantum feed-forward neural network consists of an input layer, L hidden layers, and an output layer. In particular, we will review the quantum neural network architecture proposed by Beer et al. (2020).

Beer et al. propose an arbitrary unitary operator to act as their quantum perceptron, with m input qubits and n output qubits. This perceptron is an arbitrary unitary acting on the m + n input and output qubits, depending on (2^(m+n))² − 1 parameters. These perceptron unitaries are applied layerwise from top to bottom, thus composing the architecture of the QNN.
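
To make the parameter count concrete: this is the dimension of SU(2^(m+n)), the group of unitaries on m + n qubits up to global phase. A tiny illustrative helper:

```python
def perceptron_params(m, n):
    # Real parameters of an arbitrary unitary on m + n qubits, up to global phase.
    return (2 ** (m + n)) ** 2 - 1

print(perceptron_params(2, 1))  # a perceptron with 2 inputs and 1 output: 63
```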

With these quantum perceptrons, Beer et al. propose a quantum neural network whose neurons are organized into L hidden layers of qubits.

The quantum circuit can be expressed as

𝜌(out) = Tr_in,hidden( 𝒰 ( 𝜌(in) ⊗ |0…0⟩⟨0…0|_hidden,out ) 𝒰† ), where 𝒰 = U(out) U(L) U(L−1) ⋯ U(1)

and each U(l) represents the unitary for a given layer l. These layer unitaries are composed of a product of quantum perceptrons that act upon the qubits in layers (l−1) and l. This quantum circuit acts upon an initial state of input qubits 𝜌(in) and produces a mixed state of output qubits 𝜌(out), corresponding to a probabilistic mixture of pure states.
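
The sketch below implements this layer map for a toy case in NumPy, with a random unitary standing in for a trained perceptron; the single input qubit and single output qubit are assumptions made for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_unitary(dim):
    # Random unitary via QR decomposition; a stand-in for a trained perceptron.
    a = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(a)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def apply_layer(rho_in, U, n_in, n_out):
    """Attach fresh |0...0> qubits, apply the layer unitary, trace out the inputs."""
    d_in, d_out = 2 ** n_in, 2 ** n_out
    zeros = np.zeros((d_out, d_out), dtype=complex)
    zeros[0, 0] = 1.0                                  # |0...0><0...0| for the new layer
    rho = U @ np.kron(rho_in, zeros) @ U.conj().T      # joint evolution
    rho = rho.reshape(d_in, d_out, d_in, d_out)        # split subsystems
    return np.einsum('ijik->jk', rho)                  # partial trace over the inputs

rho_in = np.array([[1.0, 0.0], [0.0, 0.0]])            # input qubit in state |0>
U = random_unitary(4)                                  # one perceptron on 1 + 1 qubits
rho_out = apply_layer(rho_in, U, n_in=1, n_out=1)
print(np.trace(rho_out).real)                          # 1.0: a valid (generally mixed) state
```

Stacking several such layers, each tracing out the previous one, reproduces the feed-forward flow from 𝜌(in) to 𝜌(out).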

Cost Function and Optimizing Cost

Just like classical neural networks, QNNs need a cost function to be trained. In other words, we need a metric that, given inputs and desired outputs, gives an objective measure of “closeness” between the network output and the desired output. Following Beer et al., it is defined as the fidelity between the two, averaged over the N training pairs:

C = (1/N) Σₓ ⟨φ(out)ₓ| 𝜌(out)ₓ |φ(out)ₓ⟩

where |φ(out)ₓ⟩ denotes the desired output state for training input x and 𝜌(out)ₓ denotes the network output.

In line with what we know about inner products, we expect a value close to 1 when the output is close to the expected output. Thus, a cost near 1 is optimal and a cost near 0 is poor, so when we train QNNs, we work to maximize the value of the above cost function. As with classical neural networks, we update the parameters in the direction that increases C most rapidly.
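
A minimal sketch of this fidelity-based cost, assuming pure target states and density-matrix network outputs:

```python
import numpy as np

def fidelity_cost(targets, outputs):
    """Average <phi|rho|phi> over the training set; 1 is perfect, 0 is orthogonal."""
    return np.mean([np.real(phi.conj() @ rho @ phi)
                    for phi, rho in zip(targets, outputs)])

phi = np.array([1.0, 0.0])                     # desired pure output state |0>
rho_good = np.outer(phi, phi.conj())           # network output equal to the target
rho_bad = np.array([[0.0, 0.0], [0.0, 1.0]])   # orthogonal output |1><1|
print(fidelity_cost([phi, phi], [rho_good, rho_bad]))  # (1 + 0) / 2 = 0.5
```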

Although we cover only feed-forward quantum neural networks here, research continues to propose new architecture types, including quantum recurrent neural networks (QRNNs) for time-series prediction, multilayer perceptron networks, quantum-enhanced Markov chain Monte Carlo, and more.

Quantum vs. Classical Neural Networks

Quantum computers hold great power because they are capable of faster computation than their classical counterparts on certain problems. Through that lens, it is important to ask whether quantum neural networks hold similar power. IBM researchers describe the capability of a neural network through its “effective dimension,” a measure of how expressive and non-redundant the model is. In their experiments, quantum neural networks achieved noticeably higher effective dimensions and reached lower loss faster than comparable classical models. Even though there is much we still do not know about how quantum neural networks perform comparatively, these results are promising: just as in other fields of computation, machine learning could be transformed by the power of quantum.

Future Directions of Quantum Neural Networks

Much of quantum computing is still in its infancy, and quantum neural networks are no exception. Future work will certainly bring faster, more efficient architectures for quantum neural networks, designed to address a wider variety of quantum machine learning problems. Just as classical neural networks offer countless applications, their quantum counterparts offer the possibility of drastically accelerating those same applications.

Recent advances have already demonstrated the promise of QNNs in fields such as generative learning and information processing. The capabilities of quantum computing and the potency of machine learning meet at a powerful intersection: quantum could further revolutionize machine learning and artificial intelligence, continuing to break boundaries.

Works Cited

Abbas, Amira, et al. “The Power of Quantum Neural Networks.” IBM Research Blog, IBM, 3 Aug. 2022, https://research.ibm.com/blog/quantum-neural-network-power.

Beer, Kerstin, et al. “Training Deep Quantum Neural Networks.” Nature Communications, 10 Feb. 2020, https://www.nature.com/articles/s41467-020-14454-2.

Cui, Wei, and Shilu Yan. “New Directions in Quantum Neural Networks Research.” Control Theory and Technology, 6 Nov. 2019, https://link.springer.com/article/10.1007/s11768-019-8289-0.

Kwak, Yunseok, et al. “Quantum Neural Networks: Concepts, Applications, and Challenges.” 2021 Twelfth International Conference on Ubiquitous and Future Networks (ICUFN), 2021, https://doi.org/10.1109/icufn49451.2021.9528698.

“Quantum Neural Network.” Wikipedia, Wikimedia Foundation, 16 Nov. 2022, https://en.wikipedia.org/wiki/Quantum_neural_network.

Siemaszko, Michal, et al. “Rapid Training of Quantum Recurrent Neural Network.” 1 July 2022.

Vidwans, Atharva. “Classification Using VQC with Custom Variational Ansatz.” Medium, Predict, 28 Apr. 2021, https://medium.com/predict/classification-using-vqc-with-custom-variational-ansatz-c7c45fb699a1.

“What Is a Neural Network?” TIBCO Software, https://www.tibco.com/reference-center/what-is-a-neural-network.
