Day 47: 60 days of Data Science and Machine Learning Series

RNN and LSTM with a project…

Naina Chaturvedi
Coders Mojo



The Recurrent Neural Network (RNN), first developed in the 1980s, is a state-of-the-art architecture for sequential data: it keeps an internal memory (the hidden state) that retains important information about the inputs it has already received, and uses that memory to predict more precisely what comes next. RNNs are popularly used in language translation, natural language processing (NLP), speech recognition, image captioning, etc.
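To make the "internal memory" idea concrete, here is a minimal NumPy sketch of the RNN recurrence (illustrative code, not the article's project; the shapes and random initialisation are placeholder assumptions). At every time step the hidden state is updated from the current input and the previous hidden state, so information from earlier inputs can influence later predictions.

```python
import numpy as np

def rnn_forward(inputs, W_xh, W_hh, b_h):
    """Run a simple RNN over a sequence of input vectors."""
    h = np.zeros(W_hh.shape[0])           # initial hidden state (the "memory")
    hidden_states = []
    for x_t in inputs:                     # one step per element of the sequence
        # new memory = f(current input, previous memory)
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
        hidden_states.append(h)
    return np.stack(hidden_states)

# Toy usage: a sequence of 5 time steps, each a 3-dimensional vector
rng = np.random.default_rng(0)
seq = rng.normal(size=(5, 3))
W_xh = rng.normal(size=(4, 3)) * 0.1       # input-to-hidden weights
W_hh = rng.normal(size=(4, 4)) * 0.1       # hidden-to-hidden (recurrent) weights
b_h = np.zeros(4)
print(rnn_forward(seq, W_xh, W_hh, b_h).shape)  # (5, 4): one hidden state per step
```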

RNN vs Feed Forward Network (Pic credits: IBM)
How RNN works (Pic credits: Research Gate)

Long Short-Term Memory networks (LSTMs), introduced by Hochreiter & Schmidhuber, are a special type of Recurrent Neural Network (RNN) designed to avoid the long-term dependency problem: their gating mechanism lets them selectively remember patterns over long durations of time.
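As a rough sketch of how an LSTM is typically used in practice (assuming a TensorFlow/Keras setup; the dummy data, shapes, and hyperparameters below are placeholders, not the article's project), a small sequence classifier might look like this:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Dummy data: 100 sequences, 20 time steps, 8 features each, binary labels
X = np.random.rand(100, 20, 8).astype("float32")
y = np.random.randint(0, 2, size=(100,))

model = keras.Sequential([
    layers.Input(shape=(20, 8)),
    layers.LSTM(32),                       # cell state carries long-range information
    layers.Dense(1, activation="sigmoid")  # binary prediction from the final hidden state
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=2, batch_size=16, verbose=0)
```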

Bidirectional LSTM (Pic credits: Datax)
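For the bidirectional variant pictured above, Keras offers a Bidirectional wrapper that runs one LSTM forward and one backward over the sequence and concatenates their outputs. A minimal sketch, again with placeholder shapes:

```python
from tensorflow import keras
from tensorflow.keras import layers

bi_model = keras.Sequential([
    layers.Input(shape=(20, 8)),
    layers.Bidirectional(layers.LSTM(32)),  # forward + backward passes, outputs concatenated
    layers.Dense(1, activation="sigmoid"),
])
bi_model.summary()
```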

Some of the other best Series —

30 Days of Natural Language Processing (NLP) Series
