RNN Numerical Walkthrough + Code
Intro to Recurrent Neural Networks along with code from scratch
Table of Contents
- Overview
- Model
- Forward Propagation
- Backpropagation Through Time (BPTT)
- Weight Updates
- Truncated Backpropagation Through Time (TBPTT)
- Conclusion
Overview
In this article, we walk through the mathematical calculations behind a recurrent neural network, exploring the outputs of the forward pass and the gradients of the backward pass. If you’re unfamiliar with forward and backward propagation in a vanilla neural network, take a look at this article: Step-by-Step Neural Network Calculations of Forward and Backpropagation
In contrast to feedforward networks, RNNs can capture temporal dependencies (dependencies through time) via a feedback loop that feeds the network’s output back in as input. Recurrent neural networks take their name from the word recurrent, meaning occurring often or repeatedly, because the same task is performed repeatedly for each element of the input sequence.
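The feedback loop can be sketched in a few lines of NumPy. This is a minimal illustration, not the article's exact model: the weight names (W_xh, W_hh, W_hy), sizes, and tanh activation are assumptions chosen for a standard vanilla RNN cell.

```python
import numpy as np

# Illustrative sketch of the recurrent feedback loop, assuming a
# single-layer vanilla RNN with tanh activation. Weight names and
# dimensions are hypothetical, chosen only for the example.
rng = np.random.default_rng(0)

input_size, hidden_size, output_size = 3, 4, 2
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1   # input  -> hidden
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # hidden -> hidden (the feedback loop)
W_hy = rng.standard_normal((output_size, hidden_size)) * 0.1  # hidden -> output

def rnn_forward(xs):
    """Apply the same cell at every time step, feeding the previous
    hidden state back in alongside the current input."""
    h = np.zeros(hidden_size)
    hs, ys = [], []
    for x in xs:                              # same task repeated for each input
        h = np.tanh(W_xh @ x + W_hh @ h)      # recurrence: h_t depends on h_{t-1}
        hs.append(h)
        ys.append(W_hy @ h)
    return hs, ys

xs = [rng.standard_normal(input_size) for _ in range(5)]  # a 5-step sequence
hs, ys = rnn_forward(xs)
```

Note that the same three weight matrices are reused at every time step; only the hidden state h changes, which is what lets the network carry information forward through time.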
Some popular uses of recurrent neural networks are speech recognition, time series prediction, NLP (machine…