
Batch, Mini-Batch and Stochastic Gradient Descent

Aug 26, 2020



Content:

  1. What is an Optimizer?
  2. Batch Gradient Descent
  3. Mini-Batch Gradient Descent
  4. What is Batch Size?
  5. Stochastic Gradient Descent
  6. Comparison

If you don’t have a good understanding of gradient descent, I would highly recommend visiting this link first, Gradient Descent explained in simple way, and then continuing here. ☺

What is an Optimizer?

An optimizer is an algorithm or method used to change the attributes of a neural network, such as its weights and learning rate, in order to reduce the loss.

In other words, the optimizer we use defines how the weights and learning rate of our neural network should be changed to reduce the loss.

Overall, we can say:

  • The difference between the predicted output and the actual output gives us the loss, and this loss is what we pass to the optimizer, as shown in the sketch below.
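To make this concrete, here is a minimal sketch of that loop in plain NumPy. It is written only for illustration (the toy data, variable names and learning rate are assumptions, not taken from any library): we compute a prediction, measure the loss against the actual output, take the gradient of that loss with respect to the weights, and update the weights using the learning rate.

```python
import numpy as np

# Toy data (made up for illustration): one feature, target roughly y = 3x
X = np.random.rand(100, 1)
y = 3.0 * X[:, 0] + np.random.randn(100) * 0.1

w, b = 0.0, 0.0        # attributes the optimizer will change
learning_rate = 0.5    # another attribute that controls the step size

for epoch in range(200):
    y_pred = w * X[:, 0] + b          # predicted output
    error = y_pred - y                # predicted output minus actual output
    loss = np.mean(error ** 2)        # the loss we pass to the optimizer

    # Gradients of the mean squared error loss w.r.t. w and b
    grad_w = 2 * np.mean(error * X[:, 0])
    grad_b = 2 * np.mean(error)

    # Gradient descent update: move the weights against the gradient
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"w={w:.3f}, b={b:.3f}, loss={loss:.5f}")  # w and b should approach 3.0 and 0.0
```

Batch, mini-batch and stochastic gradient descent differ only in how much of the data is used to compute that gradient on each update, which is what the rest of this article walks through.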

