LSTM vs GRU: Experimental Comparison

Eric Muccino
Mar 6 · 3 min read

A Recurrent Neural Network (RNN) is a type of Artificial Neural Network that shares the same layer weights across time-steps, applying them to each element of an input sequence in turn. This allows us to model temporal data such as video sequences, weather patterns, or stock prices. There are many ways to design a recurrent cell, which controls the flow of information from one time-step to the next. A recurrent cell can be designed to provide a functioning memory for the network. Two of the most popular recurrent cell designs are the Long Short-Term Memory (LSTM) cell and the Gated Recurrent Unit (GRU) cell.
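The weight-sharing idea can be sketched with a minimal vanilla RNN step in NumPy (the sizes and variable names here are illustrative assumptions, not from the article):

```python
import numpy as np

# A single vanilla RNN step. The same weights (W_x, W_h, b) are reused
# at every time-step -- this is the "shared layers through time" idea.
rng = np.random.default_rng(0)
n_in, n_hidden = 3, 4

W_x = rng.standard_normal((n_hidden, n_in)) * 0.1      # input-to-hidden weights
W_h = rng.standard_normal((n_hidden, n_hidden)) * 0.1  # hidden-to-hidden weights
b = np.zeros(n_hidden)

def rnn_step(x, h_prev):
    """Compute the new hidden state from the input and the previous state."""
    return np.tanh(W_x @ x + W_h @ h_prev + b)

# Unroll over a short sequence, reusing the same weights each step.
h = np.zeros(n_hidden)
for x_t in rng.standard_normal((5, n_in)):
    h = rnn_step(x_t, h)
```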

In this post, we will take a brief look at the design of these cells, then run a simple experiment to compare their performance on a toy data set. I recommend visiting Colah’s blog for a more in-depth look at the inner workings of the LSTM and GRU cells.

Long Short-Term Memory

[Diagram of the LSTM cell — source: Colah’s blog]
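The gate equations in the diagram can be written out as a single forward step in plain NumPy. This is a sketch following the standard LSTM formulation described on Colah’s blog; the layer sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hidden = 3, 4

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Each gate has its own weights over the concatenated [h_prev, x] vector.
W_f, W_i, W_c, W_o = (rng.standard_normal((n_hidden, n_hidden + n_in)) * 0.1
                      for _ in range(4))
b_f, b_i, b_c, b_o = (np.zeros(n_hidden) for _ in range(4))

def lstm_step(x, h_prev, c_prev):
    z = np.concatenate([h_prev, x])
    f = sigmoid(W_f @ z + b_f)        # forget gate: what to erase from memory
    i = sigmoid(W_i @ z + b_i)        # input gate: what to write
    c_tilde = np.tanh(W_c @ z + b_c)  # candidate memory content
    c = f * c_prev + i * c_tilde      # updated cell state (the "memory")
    o = sigmoid(W_o @ z + b_o)        # output gate: what to expose
    h = o * np.tanh(c)                # new hidden state
    return h, c

h = c = np.zeros(n_hidden)
h, c = lstm_step(rng.standard_normal(n_in), h, c)
```

The separate cell state `c` is what gives the LSTM its long-term memory: the forget and input gates let gradients flow through `c` largely unimpeded.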

Gated Recurrent Unit

[Diagram of the GRU cell — source: Colah’s blog]
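A matching sketch for one GRU step, following the formulation on Colah’s blog (sizes again illustrative). The GRU merges the LSTM’s cell and hidden states into one, and uses two gates (update and reset) instead of three, so it carries fewer parameters:

```python
import numpy as np

rng = np.random.default_rng(2)
n_in, n_hidden = 3, 4

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W_z, W_r, W_h = (rng.standard_normal((n_hidden, n_hidden + n_in)) * 0.1
                 for _ in range(3))

def gru_step(x, h_prev):
    zx = np.concatenate([h_prev, x])
    z = sigmoid(W_z @ zx)                                  # update gate
    r = sigmoid(W_r @ zx)                                  # reset gate
    h_tilde = np.tanh(W_h @ np.concatenate([r * h_prev, x]))  # candidate state
    return (1 - z) * h_prev + z * h_tilde                  # interpolate old/new

h = gru_step(rng.standard_normal(n_in), np.zeros(n_hidden))
```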

Experimental Comparison

[Training curves for the two models — Blue: LSTM, Orange: GRU]
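One structural difference worth quantifying when comparing the two cells at equal width: the LSTM keeps four weight/bias sets (forget, input, candidate, output) against the GRU’s three (update, reset, candidate), so a GRU layer of the same hidden size is roughly 25% smaller. A quick back-of-the-envelope count (the sizes below are arbitrary example values):

```python
# Parameter counts for a single recurrent layer with input size x and
# hidden size h. Each weight set maps the concatenated [h_prev, x] vector
# (length h + x) to h units, plus an h-dimensional bias.

def lstm_params(x, h):
    return 4 * (h * (h + x) + h)  # four gates/candidates

def gru_params(x, h):
    return 3 * (h * (h + x) + h)  # three gates/candidates

print(lstm_params(10, 32))  # 5504
print(gru_params(10, 32))   # 4128
```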

Conclusion and Future Work

Mindboard

Case Studies, Insights, and Discussions of our Modernization Efforts
