Batch, Mini-Batch and Stochastic Gradient Descent
Optimizer: an algorithm or method used to change the attributes of a neural network, such as its weights and learning rate, in order to reduce the loss.
Contents:
- What is an Optimizer?
- Batch Gradient Descent
- Mini-Batch Gradient Descent
- What is batch size?
- Stochastic Gradient Descent
- Comparison
If you don’t yet have a good understanding of gradient descent, I would highly recommend visiting this link first: Gradient Descent explained in simple way, and then continuing here. ☺
What is an Optimizer?
An optimizer is an algorithm or method used to change the attributes of a neural network, such as its weights and learning rate, in order to reduce the loss.
In other words, the optimizer we use defines how the weights and learning rate of our neural network should be changed to reduce the loss.
Overall, we can say:
- (predicted output − actual output) is nothing but the loss, and this loss is what we pass to the optimizer, as in the sketch below.
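To make this concrete, here is a minimal sketch (plain NumPy, not taken from the article; the linear model, data and learning rate are illustrative assumptions) of what an optimizer step looks like: the loss is computed from the difference between the predicted and actual output, and the weights are then changed in the direction that reduces that loss.

```python
import numpy as np

# Illustrative sketch: one run of (batch) gradient descent for a linear
# model y = w * x + b with a mean squared error loss.
np.random.seed(0)

x = np.random.rand(100)      # inputs
y = 3.0 * x + 2.0            # actual outputs the model should learn

w, b = 0.0, 0.0              # weights: the attributes the optimizer changes
learning_rate = 0.1          # another attribute the optimizer relies on

for epoch in range(200):
    predicted = w * x + b                # predicted output
    error = predicted - y                # (predicted output - actual output)
    loss = np.mean(error ** 2)           # the loss we pass to the optimizer

    # Gradients of the loss with respect to w and b
    grad_w = np.mean(2 * error * x)
    grad_b = np.mean(2 * error)

    # Optimizer step: change the weights to reduce the loss
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(w, b)  # should approach 3.0 and 2.0
```

Because every update above uses all 100 samples at once, this sketch corresponds to batch gradient descent; the later sections vary only how many samples are used per update.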

