QNN with Qiskit — The fundamental modules

Anjanakrishnan
2 min read · Sep 22, 2023


Day 22 — Quantum30 Challenge 2.0

Classical Neural Networks draw inspiration from the human brain and can be trained to recognize patterns. Quantum Neural Networks (QNNs) combine quantum computing with classical machine learning, sitting at the intersection of these two fields.

Image source: https://spectrum.ieee.org/media-library/abstract-representation-of-a-neural-network-which-is-made-of-photons-and-has-memory-capability-potentially-related-to-artificial.jpg?id=29676155&width=1200&height=900

To delve deeper into the world of QNNs, Qiskit offers its ‘qiskit-machine-learning’ module. Let’s familiarize ourselves with some fundamental modules and terms.

Qiskit provides two main QNN implementations:

EstimatorQNN module:

  • Think of it as a quantum model that estimates or predicts something, such as an outcome or a value. It takes input data and quantum circuit parameters (weights) as input.
  • It’s used when you want to calculate the expectation value of a quantum observable (like measuring an observable in a quantum experiment); a minimal construction is sketched below.
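
As a rough sketch of how this looks in code (the single-qubit circuit, parameter names, and observable below are purely illustrative), an EstimatorQNN is built from a parameterized circuit plus an observable whose expectation value it estimates:

```python
from qiskit.circuit import Parameter, QuantumCircuit
from qiskit.quantum_info import SparsePauliOp
from qiskit_machine_learning.neural_networks import EstimatorQNN

# Toy one-qubit circuit: one parameter encodes the input, one is a trainable weight
x = Parameter("x")  # input parameter
w = Parameter("w")  # weight parameter
qc = QuantumCircuit(1)
qc.ry(x, 0)
qc.rx(w, 0)

# Observable whose expectation value the QNN will estimate
observable = SparsePauliOp.from_list([("Z", 1)])

qnn = EstimatorQNN(
    circuit=qc,
    observables=observable,
    input_params=[x],
    weight_params=[w],
)
```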

SamplerQNN module:

  • Imagine it as a quantum model that samples or generates data. It takes input data and quantum circuit parameters (weights) as input.
  • It’s used when you want to get samples from measuring a quantum circuit (like running a quantum experiment and collecting the results); a minimal construction is sketched below.
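
A SamplerQNN is constructed in much the same way, except that its circuit is measured and its output is a probability distribution over measured bitstrings rather than an expectation value. A minimal sketch, again with illustrative parameters:

```python
from qiskit.circuit import Parameter, QuantumCircuit
from qiskit_machine_learning.neural_networks import SamplerQNN

x = Parameter("x")  # input parameter
w = Parameter("w")  # weight parameter
qc = QuantumCircuit(1)
qc.ry(x, 0)
qc.rx(w, 0)
qc.measure_all()  # sampling requires measurements

sampler_qnn = SamplerQNN(
    circuit=qc,
    input_params=[x],
    weight_params=[w],
)
# By default the output is a probability vector over the 2**n computational basis states
```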

An important concept in neural network training is the Forward and Backward Pass

Forward Pass:

  • It is like a prediction step: you take your input data and pass it through your neural network model to make predictions.
  • The network performs calculations using its weights to produce an output or prediction, which is then compared to the actual target to measure how well the model is performing.
  • So the primary goal of the Forward Pass is to compute the model’s output (see the sketch after this list).
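
In Qiskit Machine Learning, both EstimatorQNN and SamplerQNN expose this step through their forward method. A short sketch, reusing the qnn object from the EstimatorQNN example above (the input and weight values are arbitrary):

```python
import numpy as np

# One input sample and one weight value (arbitrary numbers, just for illustration)
inputs = np.array([[0.3]])
weights = np.array([0.7])

# Forward pass: evaluate the QNN output for these inputs and weights
output = qnn.forward(inputs, weights)
print(output)  # for EstimatorQNN with one observable: array of shape (1, 1)
```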

Backward Pass:

  • It corresponds to the learning step. After making predictions (in the forward pass), you compare them to the actual target values to calculate the error.
  • The backward pass then walks backward through the network to adjust the model’s weights so that the error is minimized.
  • It computes gradients, which indicate how each parameter should be adjusted to reduce the error and drive the optimization.
  • So, the primary goal of the Backward Pass is to update the model’s parameters to improve predictive accuracy in future forward passes (a short sketch follows this list).
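
Both QNN classes mirror this with a backward method, which returns the gradients of the output with respect to the inputs and the weights (input gradients are only computed if the QNN was created with input_gradients=True). Continuing the sketch with the same qnn, inputs, and weights:

```python
# Backward pass: compute gradients used by an optimizer to update the weights
input_grads, weight_grads = qnn.backward(inputs, weights)

print(input_grads)   # None by default, unless the QNN was built with input_gradients=True
print(weight_grads)  # gradient of each output with respect to each weight
```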

Now that we have understood some basic terms, the following chapter by Qiskit will show you all the fundamental steps one needs to follow when working with a QNN in Qiskit, regardless of the specific algorithm or application.
