DL Tutorial 11 — Understanding Long Short-Term Memory Networks
Learn how long short-term memory networks are used for sequential data processing.
Table of Contents
1. Introduction
2. What are Recurrent Neural Networks?
3. The Problem of Long-Term Dependencies
4. How Long Short-Term Memory Networks Work
5. Applications of Long Short-Term Memory Networks
6. Conclusion
1. Introduction
In this tutorial, you will learn about long short-term memory networks, or LSTMs for short. LSTMs are a type of recurrent neural network (RNN) designed to process sequential data, such as text, speech, or video. They are especially useful for handling long-term dependencies, the situation in which a network's output depends on inputs that occurred many time steps earlier. For example, to predict the last word of "I grew up in France, so I speak fluent ...", a model must remember "France" from much earlier in the sentence.
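To make the idea of step-by-step sequential processing concrete, here is a minimal sketch of a single LSTM cell in pure Python. The states and weights are scalars rather than vectors, and the weight values are arbitrary toy numbers chosen for illustration; the gate equations themselves follow the standard LSTM formulation covered later in this tutorial.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One LSTM time step with scalar states (toy illustration)."""
    f = sigmoid(w["wf"] * x + w["uf"] * h_prev + w["bf"])    # forget gate: how much old memory to keep
    i = sigmoid(w["wi"] * x + w["ui"] * h_prev + w["bi"])    # input gate: how much new info to write
    g = math.tanh(w["wg"] * x + w["ug"] * h_prev + w["bg"])  # candidate value for the cell state
    c = f * c_prev + i * g                                   # new cell state (the long-term memory)
    o = sigmoid(w["wo"] * x + w["uo"] * h_prev + w["bo"])    # output gate: how much memory to expose
    h = o * math.tanh(c)                                     # new hidden state (the short-term output)
    return h, c

# Arbitrary toy weights, just to run the cell end to end.
w = {k: 0.5 for k in ["wf", "uf", "bf", "wi", "ui", "bi",
                      "wg", "ug", "bg", "wo", "uo", "bo"]}

# Process a short sequence, carrying (h, c) across time steps.
h, c = 0.0, 0.0
for x in [1.0, 0.0, 0.0, 0.0]:
    h, c = lstm_step(x, h, c, w)
```

Notice that the cell state `c` is updated additively (`f * c_prev + i * g`) rather than being fully rewritten each step, which is what lets information from early inputs survive many time steps.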
By the end of this tutorial, you will be able to:
- Explain what recurrent neural networks are and how they work