Types of Optimization Algorithms used in Neural Networks and Ways to Optimize Gradient Descent

Anish Singh Walia
15 min read · Jun 10, 2017

Have you ever wondered which optimization algorithm to use for your neural network model to produce better and faster results by updating model parameters such as weights and bias values? Should you use gradient descent, stochastic gradient descent, or Adam?

Before writing this article, I too didn’t know the major differences between these optimization strategies, or which one is better than another.
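All of the optimizers mentioned above share the same core idea: nudge each parameter in the direction that reduces the loss. As a minimal sketch (not from this article — the toy problem, variable names, and learning rate are my own assumptions), here is the basic update rule `w ← w − lr · gradient`, shown for batch gradient descent and stochastic gradient descent on a tiny one-parameter fit:

```python
import numpy as np

# Hypothetical toy problem: recover w in y = w * x by minimizing
# the mean squared error 0.5 * mean((w*x - y)^2).
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 3.0 * x  # true weight is 3.0

def grad(w, xb, yb):
    # Gradient of the MSE loss with respect to w on a batch (xb, yb)
    return np.mean((w * xb - yb) * xb)

lr = 0.1  # assumed learning rate for this sketch

# Batch gradient descent: the full dataset contributes to every step
w_gd = 0.0
for _ in range(100):
    w_gd -= lr * grad(w_gd, x, y)

# Stochastic gradient descent: one randomly chosen sample per step
w_sgd = 0.0
for _ in range(500):
    i = rng.integers(len(x))
    w_sgd -= lr * grad(w_sgd, x[i:i + 1], y[i:i + 1])

print(w_gd, w_sgd)  # both end up close to the true weight 3.0
```

Optimizers like Adam follow the same template but rescale the gradient per parameter using running averages of past gradients, which is what the differences below come down to.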

NOTE:

Bonus Tip: One great tool I recently started using for writing tasks such as plagiarism checking, grammar checking, co-writing, paraphrasing, summarising, and translation is QuillBot.

I wanted to try something similar and cheaper than Grammarly.

I took up its yearly premium for around $2/month (45% off) during the year-end sale using coupon code HOLIDAY45, valid till the end of December. The price was dirt cheap compared to other writing tools I have used in the past.

Personally, I find its UI and UX very simple and easy to use. So I just wanted to share this productive tool with you all. Do check it out and use it in your day-to-day writing tasks.

https://try.quillbot.com/

Best Productivity Writing tool for this month
