TRANSFORMATIVE BENEFITS OF QUANTUM MACHINE LEARNING

Laia Domingo Colomer
Published in Ingenii Quantum · Jun 21, 2024

Quantum computing is poised to revolutionize the technological landscape, bringing solutions to problems once deemed unsolvable. In a previous post, we explored the fundamental differences between classical and quantum computers, highlighting the revolutionary potential of quantum algorithms. Among the many fields poised to benefit, machine learning stands out as one of the most promising areas for quantum enhancement.

WHAT IS QUANTUM MACHINE LEARNING?

Quantum machine learning (QML) combines the groundbreaking technologies of quantum computing and machine learning. By integrating quantum algorithms into the machine learning pipeline, we can achieve advantages such as exponential computational speed-ups, novel data representations, and improved data augmentation. While classical machine learning is already powerful, the addition of quantum computing opens up new possibilities for overcoming existing limitations.

The Bottlenecks of Machine Learning

Despite their impressive capabilities, classical machine learning models face significant challenges. One major bottleneck is the immense computational power required to train and deploy these models. As we enter the era of large-scale AI models, such as large language models (LLMs), the compute power demands skyrocket [1]. For instance, training a model like GPT-4 can cost over $100 million [2,3]. If this trend continues, some estimates suggest that by 2037, the compute costs could exceed the entire US GDP [4]. This scarcity of computational resources consolidates market power among a few key players, such as Nvidia, Google, Microsoft, and Amazon [1].

Another significant bottleneck is the curse of dimensionality. As the dimensionality of data increases, the complexity of finding useful patterns grows exponentially, often leading to overfitting and reduced performance of supervised learning methods. This necessitates new algorithmic techniques that can handle high-dimensional data accurately and efficiently.

All in all, we need new algorithmic techniques to handle high-dimensional data more accurately while also cutting down on the time and costs of training these models.

(Figure: the rising compute cost of training AI models. Source: Lennart Heim)

HOW QUANTUM MACHINE LEARNING CAN HELP

Quantum computing offers a promising solution to overcome these challenges. The advantages of quantum machine learning vary based on the specific algorithm and its application. To better understand these benefits, let’s explore two main categories of quantum machine learning algorithms.

Based on Linear Algebra

The first category of quantum machine learning (QML) algorithms focuses on accelerating linear algebra subroutines. Before deep learning dominated the scene, many machine learning methods — such as support vector machines, principal component analysis, and clustering techniques — relied heavily on linear algebra. Interestingly, quantum computing is fundamentally rooted in linear algebra, allowing many of these subroutines to be significantly sped up with quantum algorithms. Here are some notable examples:

  1. Fourier Transform: Widely used in machine learning tasks like speech recognition, audio analysis, and vibration analysis, the Fourier Transform extracts frequency-based features from raw signals and aids in denoising, image filtering, and audio compression. Classically, the fast Fourier transform runs with a complexity of O(N log N) for N data points. The quantum version (the Quantum Fourier Transform [5]) achieves the same transform with only O(log²(N)) gates, offering an exponential advantage (see the sketch after this list).
  2. Harrow-Hassidim-Lloyd (HHL) Algorithm: This algorithm [6] solves linear systems of equations, a common requirement in many machine learning models such as support vector machines and recommender systems, which often involve matrix factorization. Classical algorithms, like the conjugate gradient method, have polynomial complexity O(N·κ), where κ is the condition number of the matrix. The HHL algorithm, in contrast, runs in O(log(N)·poly(κ, 1/ε)), where ε is the desired precision of the solution. This represents an exponential speedup in the dimension N, assuming κ and 1/ε remain manageable.
  3. Quantum Phase Estimation (QPE): This algorithm [7] estimates the eigenvalues of a unitary operator. In spectral graph theory applications, such as spectral clustering, QPE can be used to find the eigenvalues of the graph Laplacian, significantly speeding up clustering based on spectral properties. Principal component analysis, a dimensionality-reduction method, also involves solving eigenvalue problems that QPE can accelerate. The classical complexity of these problems is around O(N²), while the quantum complexity can be reduced to O(log²(N)), providing an exponential advantage.
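
To make the qubit and gate scaling concrete, here is a minimal sketch using Qiskit's built-in QFT circuit, assuming Qiskit is installed; the qubit count and input state are purely illustrative, not part of the referenced algorithms:

    # Minimal illustrative sketch (assumes Qiskit is installed).
    # A QFT on n qubits acts on a state vector of N = 2**n amplitudes,
    # which is why its gate count scales as O(log^2 N) rather than the
    # O(N log N) operations of the classical FFT.
    from qiskit import QuantumCircuit
    from qiskit.circuit.library import QFT

    n_qubits = 4                                # N = 2**4 = 16 amplitudes
    circuit = QuantumCircuit(n_qubits)
    circuit.h(range(n_qubits))                  # simple superposition as input
    circuit = circuit.compose(QFT(n_qubits))    # apply the Quantum Fourier Transform

    # Roughly n Hadamards plus n(n-1)/2 controlled-phase gates after decomposition.
    print(circuit.decompose().count_ops())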

While these algorithms promise significant benefits over classical methods, they currently require advanced quantum hardware that is still under development. Therefore, while they hold the potential to revolutionize classical machine learning in the next five to ten years, their practical application remains largely theoretical at present.

Based on Data Processing

The second category of quantum machine learning (QML) algorithms focuses on data representation and processing rather than linear algebra subroutines. Instead of seeking exponential speed-ups, these algorithms aim for benefits such as higher accuracy, efficiency, energy savings, or even explainability. Let’s look at some examples:

  1. Encode Exponentially Large Data Samples: Quantum algorithms can encode and manipulate exponentially larger data samples than classical algorithms by leveraging superposition and entanglement, achieving up to an exponential reduction in the number of qubits needed. For example, amplitude encoding uses n qubits to store up to 2^n values [8] (a minimal sketch follows this list). Different efficient encoding methods can be applied depending on the input data type: in image processing, the flexible representation of quantum images [8] offers an exponential reduction in qubits, and quantum relaxations [10] can reduce qubit requirements in optimization problems. Once the data is encoded in a quantum state, another quantum algorithm, such as a quantum neural network or a kernel method, processes it and provides additional benefits.
  2. Reduced Complexity and Computing Costs: Some quantum machine learning models require significantly fewer training parameters to achieve the same expressive power. For instance, a quantum neural network with an efficient data encoding can have far fewer training parameters than its classical counterpart [11], thanks to the reduced number of qubits needed to process the data. This lower complexity allows for shorter training times and lower costs. Another example is tensor networks, a quantum-inspired model that can optimally compress neural networks, efficiently reducing training parameters, GPU usage, and both training and inference time [12].
  3. Smoother Training and Fewer Chances of Overfitting: Variational quantum algorithms and quantum neural networks can offer smoother convergence of the loss function compared to classical models due to the reduced complexity of the resulting model. This often leads to better generalization capacity on the test set [13].
  4. Explainability: Some quantum machine learning algorithms enhance the explainability of solutions. This is the case for Bayesian networks. Inference in large classical Bayesian networks is computationally intensive, often requiring approximation methods for practical use; the complexity of exact inference scales exponentially with the number of nodes in the worst case. Quantum Bayesian networks [14], on the other hand, can perform inference more efficiently by exploiting quantum parallelism, providing direct access to joint and marginal probability distributions and a deeper understanding of the causal relationships between nodes.
  5. Efficiency: Quantum computing excels at optimization problems such as graph partitioning (MaxCut) and combinatorial problems (QUBO). Many machine learning problems can be formulated as combinatorial optimization problems. For example, unsupervised image segmentation can be transformed into a graph partitioning problem [15], where each pixel of the image corresponds to a node in the graph and the edges represent similarities between neighbouring pixels. The optimal segmentation is then the partition of the graph that maximizes the dissimilarity between classes.
  6. Novel Representations of Data: Selecting suitable representations for input data is essential for achieving optimal performance in many classification and regression algorithms. Quantum algorithms such as quantum kernel methods [16] can generate novel data representations that are difficult or impossible to create with classical kernels (a toy fidelity-kernel sketch also follows this list). These kernels plug directly into machine learning algorithms such as SVMs and spectral clustering.
  7. Efficient Sampling: The probabilistic nature of quantum computing allows for the generation of realistic distributions, useful in quantum generative models that leverage quantum mechanics to generate complex data distributions. Quantum generative models often demonstrate better generalization, particularly when data is scarce [17].
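
As a concrete illustration of point 1 above (encoding exponentially large data samples), here is a minimal amplitude-encoding sketch, assuming Qiskit is installed; the data values and qubit count are purely illustrative:

    # Minimal illustrative sketch (assumes Qiskit is installed).
    # Amplitude encoding packs N = 2**n normalized values into the
    # amplitudes of an n-qubit state, so 16 values need only 4 qubits.
    import numpy as np
    from qiskit import QuantumCircuit

    data = np.random.rand(16)                   # 16 classical values (illustrative)
    state = data / np.linalg.norm(data)         # normalize to a valid quantum state

    n_qubits = int(np.log2(len(state)))         # only 4 qubits for 16 values
    circuit = QuantumCircuit(n_qubits)
    circuit.initialize(state, range(n_qubits))  # amplitude encoding

    # A downstream QML model (variational circuit, quantum kernel, ...) would
    # then operate on this exponentially compact representation.
    print(circuit)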

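For the quantum kernel methods mentioned in point 6, a toy fidelity kernel can be evaluated by classical simulation. This is a minimal sketch, assuming Qiskit is installed; the ZZFeatureMap and the sample data points are illustrative choices rather than the specific kernels of reference [16]:

    # Minimal illustrative sketch (assumes Qiskit is installed).
    # A fidelity quantum kernel: k(x1, x2) = |<phi(x1)|phi(x2)>|^2, where phi
    # is a quantum feature map, evaluated here by statevector simulation.
    import numpy as np
    from qiskit.circuit.library import ZZFeatureMap
    from qiskit.quantum_info import Statevector

    feature_map = ZZFeatureMap(feature_dimension=2, reps=2)

    def quantum_kernel(x1, x2):
        sv1 = Statevector(feature_map.assign_parameters(x1))
        sv2 = Statevector(feature_map.assign_parameters(x2))
        return np.abs(sv1.inner(sv2)) ** 2

    # The resulting kernel values can be fed to a classical SVM or to
    # spectral clustering exactly like any classical kernel.
    print(quantum_kernel([0.1, 0.4], [0.3, 0.2]))
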
The advantage of these QML algorithms is that they require much less overhead from quantum hardware. Many of them can already be run today on small and intermediate-sized problems and will scale as quantum hardware advances. Moreover, most quantum-inspired algorithms already provide significant advantages today: they are optimized to run on classical hardware such as GPUs, and they are ready to evolve into fully quantum models as quantum computers scale. This delivers tangible benefits now while preparing for future advancements.

Quantum Machine Learning promises to revolutionize machine learning by overcoming current limitations like high computational costs and the curse of dimensionality. Leveraging the unique strengths of quantum computing, QML offers up to exponential speed-ups and novel data representations that enhance accuracy, efficiency, and scalability. While some quantum algorithms require advanced hardware still in development, many quantum-inspired techniques can already be used on classical hardware, offering immediate benefits. As quantum technology advances, QML will pave the way for a more powerful and accessible future in machine learning.

Enroll in Ingenii’s FREE quantum fundamentals course.

We’ve curated an introductory quantum machine learning fundamentals course so anyone can begin their QML journey!

  • Virtual, self-paced lessons
  • Quantum machine learning focus
  • Completely FREE

Subscribe here to be notified when the course is released.

Interested in our wider business journey? Follow us on Instagram!
