Gradient Descent

Mohor B
1 min read · Aug 30, 2023


In today’s lesson, I revisited a learning strategy used very commonly throughout the machine learning domain: gradient descent. It was also enlightening to see how it can be extended to the quantum domain.

Our objective in most machine learning problems is to reduce the loss/cost function computed for a set of parameters so that the data can be modelled better by the machine learning model. Gradient descent is a technique that can help us converge to a local minimum of the cost function.

A smart way to minimize a function is to move in the direction of steepest descent, i.e., against the gradient, by updating the parameters accordingly. The learning rate, eta, is a small positive hyperparameter that decides the size of each update. Carried out iteratively, this process leads us toward a local minimum of the function.
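The update rule described above can be sketched in a few lines. This is a minimal illustration on a toy quadratic loss (my own stand-in example, not from the lesson), where the gradient is known in closed form:

```python
import numpy as np

def gradient_descent(grad_fn, theta0, eta=0.1, n_steps=100):
    """Repeatedly step against the gradient, scaled by the learning rate eta."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_steps):
        theta = theta - eta * grad_fn(theta)  # theta <- theta - eta * grad L(theta)
    return theta

# Toy loss L(theta) = (theta - 3)^2, with gradient 2 * (theta - 3);
# its minimum sits at theta = 3.
grad = lambda theta: 2 * (theta - 3)
theta_min = gradient_descent(grad, theta0=0.0)
```

With `eta=0.1` the iterates contract toward the minimum geometrically; too large a learning rate would instead overshoot and diverge.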

This process can be used in quantum machine learning as well. Two kinds of gradients can be used: finite-difference gradients and analytic gradients.
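The two approaches can be contrasted on a toy expectation value. Here `f(theta) = cos(theta)` is my own stand-in for a circuit output; the analytic gradient uses the parameter-shift rule, which is exact for gates of the form `exp(-i * theta * P / 2)`:

```python
import numpy as np

# Stand-in for a parameterized circuit's expectation value (assumption for
# illustration): f(theta) = cos(theta), exact derivative -sin(theta).
f = lambda theta: np.cos(theta)

def finite_diff_grad(f, theta, eps=1e-4):
    """Central finite difference: approximate, accuracy controlled by eps."""
    return (f(theta + eps) - f(theta - eps)) / (2 * eps)

def parameter_shift_grad(f, theta):
    """Parameter-shift rule: evaluates f at theta +/- pi/2 and is exact
    for expectation values generated by gates exp(-i * theta * P / 2)."""
    return (f(theta + np.pi / 2) - f(theta - np.pi / 2)) / 2

theta = 0.4
fd = finite_diff_grad(f, theta)
ps = parameter_shift_grad(f, theta)
```

Both estimates agree with the exact derivative `-sin(theta)` here; the practical difference is that the finite-difference version carries a discretization error, while the parameter-shift version does not.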

Reference: Training parameterized quantum circuits (the website from which I read)

Day 30 of Learning: #Quantum30 by QuantumComputingIndia
