Universal Backpropagation in 15 lines of code

Ray Born
May 24 · 1 min read

Backpropagation is the core of today’s deep learning applications. If you’ve dabbled with deep learning, there’s a good chance you’re aware of the concept.

If you’ve ever implemented backpropagation manually, you’re probably very grateful that deep learning libraries automatically do it for you. Implementing backprop by hand is arduous, yet the concept behind general backpropagation is very simple.

Did you know that backprop restricted to eager execution is almost trivial? You can implement it in 15 lines of code. For each output, you just need to track its inputs and the gradient of the operation that produced it (what I call a ‘jacob’).
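Here is a rough sketch of what that looks like, not the exact code from my repo: the names `Tensor`, `parents`, and `jacobs` are just illustrative, and each ‘jacob’ is a closure that turns the upstream gradient into the gradient for one input.

```python
import numpy as np

class Tensor:
    def __init__(self, value, parents=(), jacobs=()):
        self.value = np.asarray(value, dtype=float)
        self.parents = parents   # the input Tensors of the op that produced this one
        self.jacobs = jacobs     # one function per parent: upstream grad -> grad w.r.t. that parent
        self.grad = np.zeros_like(self.value)

    def backward(self, grad=None):
        # Seed the final output (e.g. the loss) with ones, then recurse through the inputs.
        if grad is None:
            grad = np.ones_like(self.value)
        self.grad = self.grad + grad
        for parent, jacob in zip(self.parents, self.jacobs):
            parent.backward(jacob(grad))

def mul(a, b):
    # Each op records its inputs and their local gradients at the moment it runs (eager mode).
    return Tensor(a.value * b.value,
                  parents=(a, b),
                  jacobs=(lambda g: g * b.value, lambda g: g * a.value))
```

With just that, gradients flow back through any chain of ops you define this way:

```python
x = Tensor(2.0)
y = Tensor(3.0)
z = mul(mul(x, y), x)   # z = x^2 * y
z.backward()
print(x.grad)           # 12.0 = 2*x*y
print(y.grad)           # 4.0  = x^2
```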

See my repo for the full implementation; it’s tested on MNIST.

To be continued…

Written by Ray Born

Undergrad at WSU. In my free time I enjoy dogs, running, and video games.