RNN Numerical Walkthrough + Code

An intro to recurrent neural networks, along with code from scratch

Kat He
The Startup


Table of Contents

  1. Overview
  2. Model
  3. Forward Propagation
  4. Backpropagation Through Time (BPTT)
  5. Weight Updates
  6. Truncated Backpropagation Through Time (TBPTT)
  7. Conclusion

Overview

In this article, we take a look at the mathematical calculations behind a recurrent neural network. We’ll explore the outputs of the forward pass and the gradients of the backward pass. If you’re unfamiliar with forward and backward propagation in a vanilla neural network, take a look at this article first: Step-by-Step Neural Network Calculations of Forward and Backpropagation.

In contrast to feedforward networks, RNNs can capture temporal dependencies (dependencies through time) through a feedback loop that feeds the hidden state from one time step back in as input at the next. Recurrent neural networks take their name from the word recurrent, meaning occurring repeatedly, because the same computation, with the same weights, is performed at every time step of the input sequence.
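To make the feedback loop concrete, here is a minimal sketch of a vanilla RNN cell in NumPy. The tanh activation and the names W_xh, W_hh, and b_h are illustrative assumptions, not fixed by this article; the point to notice is that the previous hidden state is fed back in, and the same weights are reused at every step.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One recurrence: the previous hidden state h_prev is fed back in as input."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

rng = np.random.default_rng(0)
input_size, hidden_size, T = 3, 4, 5

# Illustrative weights; a real model would learn these via BPTT.
W_xh = 0.1 * rng.standard_normal((hidden_size, input_size))
W_hh = 0.1 * rng.standard_normal((hidden_size, hidden_size))
b_h = np.zeros(hidden_size)

h = np.zeros(hidden_size)                  # initial hidden state
xs = rng.standard_normal((T, input_size))  # a toy input sequence of length T

for x_t in xs:                             # the same task, repeated for each input
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)
print(h)
```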

Some popular uses of recurrent neural networks are speech recognition, time series prediction, NLP (machine…
