Published in MLearning.ai
Two Steps Forward, Two Steps Back

Repeat Until Convergence (Neural Networks and Back Propagation)

In this post, I’ll start with a high-level review of what we’ve learned so far about neural networks and how they work, up through a complete forward pass. Then I’ll conceptually walk through the back propagation technique, which uses gradient descent to adjust the randomized weight and bias values so that predictions align more closely with the actual labels. Along the way, we’ll uncover some really neat math effects of using the ReLU activation function, and find out how the chain rule is applied to make finding the gradients across all layers tractable.
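To make the idea concrete, here is a minimal sketch (not the article's actual code) of one forward pass followed by a backpropagation update for a single-hidden-layer network with a ReLU activation and squared-error loss. All names, sizes, and the learning rate are illustrative assumptions:

```python
# Minimal sketch: forward pass + backprop + gradient descent update.
# Network: input -> ReLU hidden layer -> linear output, squared-error loss.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 examples, 3 features, scalar labels (illustrative only).
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# Randomized starting weights and biases, as described in the post.
W1 = rng.normal(size=(3, 5)) * 0.1
b1 = np.zeros((1, 5))
W2 = rng.normal(size=(5, 1)) * 0.1
b2 = np.zeros((1, 1))

lr = 0.1
for step in range(100):  # "repeat until convergence" (fixed count here)
    # ---- Forward pass ----
    z1 = X @ W1 + b1          # pre-activation, hidden layer
    a1 = np.maximum(0.0, z1)  # ReLU: passes positives, zeroes out negatives
    y_hat = a1 @ W2 + b2      # linear output layer
    loss = np.mean((y_hat - y) ** 2)

    # ---- Backward pass (chain rule, applied layer by layer) ----
    n = X.shape[0]
    d_yhat = 2.0 * (y_hat - y) / n         # dLoss/dy_hat
    dW2 = a1.T @ d_yhat                    # dLoss/dW2
    db2 = d_yhat.sum(axis=0, keepdims=True)
    d_a1 = d_yhat @ W2.T                   # chain back into the hidden layer
    d_z1 = d_a1 * (z1 > 0)                 # ReLU derivative: 1 if z > 0, else 0
    dW1 = X.T @ d_z1
    db1 = d_z1.sum(axis=0, keepdims=True)

    # ---- Gradient descent update ----
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final loss: {loss:.4f}")
```

Note the line `d_z1 = d_a1 * (z1 > 0)`: this is one of those neat ReLU effects. Its derivative is just a 0/1 mask, so backpropagating through a ReLU layer either passes a gradient through unchanged or zeroes it out, with no extra derivative computation needed.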

Jason Eden

Data Science & Cloud nerd with a passion for making complex topics easier to understand. All writings and associated errors are my own doing, not work-related.
