Generative Modeling in QML — An overview

Anjanakrishnan
3 min read · Sep 27, 2023


Day 27 — Quantum30 Challenge 2.0

Note: Do check out the GitHub page linked in the ‘Reference’ section of the article for a better understanding and a full implementation.

Introduction

Hybrid Computing and AWS

Hybrid quantum computing harnesses both classical computation and quantum hardware. A typical workflow begins with classical preprocessing such as data preparation, parameter initialization, and setting up quantum states. This is followed by quantum computation, including running quantum algorithms and tackling optimization problems. Finally, classical post-processing interprets the outcomes. Hybrid jobs leverage the capabilities of both classical computers and quantum processors to solve complex problems effectively.

Amazon Braket provides a platform to perform such hybrid jobs through its AwsQuantumJob module. Let's explore an example in the field of generative modeling within quantum machine learning.


Generative Modeling

Generative modeling entails creating models capable of generating new data samples that resemble a given dataset. Think of it as a computerized artist: these programs learn from examples and then generate new data that closely resembles what they have learned. A well-known example is generating realistic-looking images of faces of people who may not even exist in the real world — welcome to the AI age!

Figure: Improvement of generated faces over time using generative models. Source: https://www.oreilly.com/api/v2/epubs/9781098134174/files/assets/gdl2_0103.png

The Maximum Mean Discrepancy (MMD) is a statistical measure that quantifies the difference between two probability distributions. In simpler terms, it tells you how similar or dissimilar two sets of data points are, which makes it a natural training loss for a generative model: minimize the MMD between the model's samples and the real data.
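
To make this concrete, here is a minimal sketch of an MMD estimator in Python with a Gaussian kernel. The kernel width `sigma` and the sample sizes are my own illustrative choices, not something prescribed by the article's reference.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel matrix between two 1-D sample arrays."""
    return np.exp(-np.subtract.outer(x, y) ** 2 / (2 * sigma ** 2))

def mmd_squared(x, y, sigma=1.0):
    """Biased estimator of the squared MMD between sample sets x and y."""
    return (gaussian_kernel(x, x, sigma).mean()
            + gaussian_kernel(y, y, sigma).mean()
            - 2 * gaussian_kernel(x, y, sigma).mean())

# Similar distributions give a small MMD; shifted ones give a larger MMD.
rng = np.random.default_rng(0)
a, b = rng.normal(0, 1, 500), rng.normal(0, 1, 500)
print(mmd_squared(a, b))                      # close to 0
print(mmd_squared(a, rng.normal(3, 1, 500)))  # clearly larger
```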

Quantum Circuit Born Machine

Now, let’s introduce the quantum aspect. Enter the Quantum Circuit Born Machine (QCBM). This quantum generative model harnesses quantum circuits to efficiently sample from high-dimensional probability distributions — those with numerous possible outcomes or variables involved. In a QCBM, we design a quantum circuit comprising a series of quantum gates that manipulate the quantum states of qubits. The quantum state encodes the probability distribution: measuring the circuit yields a bitstring x with Born probability p(x) = |⟨x|ψ(θ)⟩|², which is where the model gets its name.
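
To see this in action, the snippet below builds a tiny two-qubit parameterized circuit with the Braket SDK and samples bitstrings from it on the local simulator. The specific gates and angles are arbitrary choices for the demo.

```python
from braket.circuits import Circuit
from braket.devices import LocalSimulator

# A tiny "Born machine": rotations set the amplitudes, CNOT entangles them.
circuit = Circuit().ry(0, 0.8).ry(1, 1.2).cnot(0, 1)

device = LocalSimulator()
counts = device.run(circuit, shots=1000).result().measurement_counts
print(counts)  # a Counter of bitstrings, e.g. {'00': ..., '11': ..., ...}
```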

Steps for implementation

The implementation of QCBM on Amazon Braket can be broken down into a series of steps:

Problem Setup:

  • Clearly define the problem you want to solve using the QCBM. This may involve specifying the target probability distribution you want to approximate or the generative modeling task you aim to accomplish.
  • Quantum Resources Allocation: Determine the number of qubits and quantum resources required for your problem. This step sets the foundation for configuring the quantum circuit architecture.

Generation of Data:

  • Generate synthetic data or collect a dataset relevant to your problem; a minimal sketch follows this list. Ensure the data format is compatible with the QCBM’s input requirements.
  • Data Preprocessing (Classical): Perform any necessary classical data preprocessing tasks, such as scaling, encoding, or feature extraction, to prepare the data for training.
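
As a sketch of this step (the bimodal target and the bin count are my own illustrative choices), one might discretize samples from a mixture of Gaussians into the 2^n outcomes an n-qubit QCBM can produce:

```python
import numpy as np

n_qubits = 3
n_bins = 2 ** n_qubits  # one bin per measurable bitstring

# Hypothetical target: a bimodal Gaussian mixture, binned into 2^n outcomes.
rng = np.random.default_rng(42)
samples = np.concatenate([rng.normal(2.0, 0.6, 500),
                          rng.normal(6.0, 0.6, 500)])
counts, _ = np.histogram(samples, bins=n_bins, range=(0, n_bins))
target_probs = counts / counts.sum()  # empirical target distribution
```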

Data Upload:

  • The data is passed to the Braket Hybrid Job, specifying the data’s location and format.
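
A sketch of how that might look, assuming the `target_probs` array from the previous step; the file path and channel name are hypothetical placeholders. Inside the job container, Braket exposes input channels under the directory named by the AMZN_BRAKET_INPUT_DIR environment variable.

```python
import numpy as np

np.save("data/target_probs.npy", target_probs)

# Later passed to the hybrid job as a named input channel, e.g.:
#   input_data={"target": "data/target_probs.npy"}
# Inside the job script, the channel is then readable from
#   os.environ["AMZN_BRAKET_INPUT_DIR"] + "/target".
```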

Setting up Hyperparameters:

  • Define the hyperparameters that will govern the QCBM’s behavior during training. These may include the number of circuit layers, the learning rate, convergence criteria, and regularization strengths.
  • Hyperparameters are set before the training process begins. They control various aspects of the model’s behavior and performance during training, playing a critical role in fine-tuning the model; an example dictionary follows this list.
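
For example (the names and values below are illustrative; Braket passes hyperparameter values to the job script as strings):

```python
hyperparameters = {
    "n_qubits": "3",
    "n_layers": "4",          # depth of the QCBM ansatz
    "learning_rate": "0.1",
    "max_iterations": "200",  # convergence criterion
}
```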

Quantum Circuit Design:

  • Create the quantum circuit architecture for the QCBM. This involves designing a sequence of quantum gates and qubits, tailored to your problem and hyperparameters.
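
One common design, sketched below with the Braket SDK, alternates layers of single-qubit rotations with a ring of CNOTs; the exact gate pattern is an illustrative choice, not the only possible ansatz.

```python
import numpy as np
from braket.circuits import Circuit

def qcbm_ansatz(n_qubits: int, params: np.ndarray) -> Circuit:
    """Layered QCBM ansatz: RY/RZ rotations followed by a ring of CNOTs.

    `params` holds 2 angles per qubit per layer.
    """
    circuit = Circuit()
    for layer in params.reshape(-1, n_qubits, 2):
        for q in range(n_qubits):
            circuit.ry(q, layer[q, 0])
            circuit.rz(q, layer[q, 1])
        for q in range(n_qubits):
            circuit.cnot(q, (q + 1) % n_qubits)
    return circuit

# 4 layers on 3 qubits -> 4 * 3 * 2 = 24 trainable angles.
params = np.random.uniform(0, 2 * np.pi, size=4 * 3 * 2)
circuit = qcbm_ansatz(3, params)
```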

Training Setup:

  • Configure the training process, specifying the optimization algorithm (e.g., gradient descent) and the convergence criteria (e.g., maximum number of iterations or desired loss threshold).
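
A minimal training loop might look like the sketch below, reusing `qcbm_ansatz`, `params`, `n_qubits`, and `target_probs` from the earlier sketches. For the loss it uses the distribution-level squared MMD, (p - q)ᵀK(p - q) with K the kernel matrix over outcomes, and plain finite-difference gradient descent; real implementations often use the parameter-shift rule instead.

```python
import numpy as np
from braket.devices import LocalSimulator

device = LocalSimulator()
outcomes = np.arange(2 ** n_qubits)
K = np.exp(-np.subtract.outer(outcomes, outcomes) ** 2 / 2.0)  # Gaussian kernel

def model_probs(params, shots=1000):
    """Estimate the QCBM's output distribution from measured bitstrings."""
    counts = device.run(qcbm_ansatz(n_qubits, params), shots=shots).result().measurement_counts
    return np.array([counts.get(format(i, f"0{n_qubits}b"), 0)
                     for i in range(2 ** n_qubits)]) / shots

def loss(params):
    d = model_probs(params) - target_probs
    return d @ K @ d  # squared MMD between model and target distributions

lr, eps = 0.1, 0.1
for step in range(200):  # max_iterations from the hyperparameters
    grad = np.zeros_like(params)
    for i in range(len(params)):
        shift = np.zeros_like(params)
        shift[i] = eps
        grad[i] = (loss(params + shift) - loss(params - shift)) / (2 * eps)
    params = params - lr * grad
```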

Creating a Braket Job:

  • Specify the device (e.g., Amazon Braket SV1 simulator).
  • Provide the directory containing the code to run (source_module).
  • Define the entry point (main function).
  • Optionally, assign a unique job name.
  • Specify hyperparameters and input data.
  • Choose to wait until the job completes or run it asynchronously. Upon completion, you can obtain results, including the optimized QCBM parameters.
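
Putting it together, a job submission might look like the sketch below; the source module, entry point, and job name are hypothetical placeholders for your own training script.

```python
from braket.aws import AwsQuantumJob

job = AwsQuantumJob.create(
    device="arn:aws:braket:::device/quantum-simulator/amazon/sv1",
    source_module="qcbm",             # directory containing the training code
    entry_point="qcbm.train:main",    # hypothetical module:function to run
    job_name="qcbm-training-demo",    # optional unique name
    hyperparameters=hyperparameters,
    input_data={"target": "data/target_probs.npy"},
    wait_until_complete=True,         # or False to run asynchronously
)
print(job.result())  # whatever the entry point saved, e.g. optimized parameters
```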

Finally, we monitor the loss function during training and may explore a grid search to find optimal hyperparameters for the task.
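
A grid search can be as simple as launching one asynchronous job per hyperparameter combination, as in this sketch (the grid values are illustrative):

```python
from itertools import product

from braket.aws import AwsQuantumJob

for n_layers, lr in product([2, 4, 6], [0.05, 0.1, 0.2]):
    AwsQuantumJob.create(
        device="arn:aws:braket:::device/quantum-simulator/amazon/sv1",
        source_module="qcbm",
        entry_point="qcbm.train:main",
        hyperparameters={"n_qubits": "3", "n_layers": str(n_layers),
                         "learning_rate": str(lr), "max_iterations": "200"},
        input_data={"target": "data/target_probs.npy"},
        wait_until_complete=False,  # run the grid points in parallel
    )
```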

Reference

QuantumComputingIndia
