Types of Optimization Algorithms used in Neural Networks and Ways to Optimize Gradient Descent

Anish Singh Walia
15 min read · Jun 10, 2017

Have you ever wondered which optimization algorithm to use for your neural network model to get better and faster results when updating model parameters such as the weights and bias values? Should you use Gradient Descent, Stochastic Gradient Descent, or Adam?

Before writing this article, I too didn’t know the major differences between these optimization strategies, or which one is better than another.
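All of these optimizers are variations on the same basic idea: update each parameter by stepping against the gradient of the loss. As a minimal sketch (the toy loss, function name, and learning rate below are illustrative, not from the article), here is the vanilla gradient-descent update on a one-parameter loss f(w) = (w − 3)²:

```python
# Vanilla gradient descent: w = w - lr * gradient.
# Toy loss f(w) = (w - 3)^2, whose gradient is 2 * (w - 3);
# the minimum is at w = 3.

def gradient_descent(lr=0.1, steps=100):
    w = 0.0                      # initial parameter value
    for _ in range(steps):
        grad = 2 * (w - 3)       # gradient of the loss at the current w
        w = w - lr * grad        # the update step shared by all these optimizers
    return w

print(gradient_descent())        # approaches the minimum at w = 3
```

The optimizers compared in this article differ mainly in how they compute the gradient (full batch vs. mini-batches) and how they scale or accelerate this update step.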
