Gradient Descent vs Stochastic Gradient Descent vs Batch Gradient Descent vs Mini-batch Gradient Descent

Amy @GrabNGoInfo · Published in GrabNGoInfo · 4 min read · Dec 16, 2022


Data science interview questions and answers


Gradient descent is a commonly asked topic in data science and machine learning interviews. Some example interview questions are:

  • What is gradient descent?
  • What are the pros and cons of stochastic gradient descent?
  • What are the differences between batch gradient descent and mini-batch gradient descent?

In this tutorial, we will answer these questions by comparing gradient descent, stochastic gradient descent, batch gradient descent, and mini-batch gradient descent.

Let’s get started!

Gradient Descent

Gradient descent is an optimization algorithm used to find the minimum of a function. It works by iteratively moving in the direction that reduces the value of the function the most, i.e., the direction opposite the gradient. Gradient descent is a common algorithm used in machine learning to find the optimal model parameters by minimizing a loss function.
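The update rule described above can be sketched in a few lines of Python. This is a minimal illustration on a simple one-dimensional function, not the full machine learning training loop; the function names and hyperparameter values are chosen for the example.

```python
def gradient_descent(grad, x0, learning_rate=0.1, n_iters=100):
    """Minimize a function by repeatedly stepping opposite its gradient.

    grad: function returning the gradient at a point
    x0: starting point
    learning_rate: step size for each update
    n_iters: number of iterations
    """
    x = x0
    for _ in range(n_iters):
        x = x - learning_rate * grad(x)  # move against the gradient
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
# The true minimum is at x = 3.
minimum = gradient_descent(grad=lambda x: 2 * (x - 3), x0=0.0)
print(round(minimum, 4))  # converges toward 3.0
```

The learning rate controls the trade-off between convergence speed and stability: too small and the algorithm converges slowly, too large and it can overshoot the minimum and diverge.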
