DL: Basic Concept of RNN

Part 5.1 of Deep Learning Specialization

Pisit J.
Sum up As A Service
3 min read · Apr 5, 2020



1. Why RNN, not NN?

1.1 RNN cell

1.2 Cell vs Layer

1 Cell, 1 Layer

The input sequence is fed step by step into the same RNN cell, which reuses its weights at every time step.
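A minimal sketch of this idea (tanh activation assumed; the weight names `Wx`, `Wh`, `b` are hypothetical, not from the original post):

```python
import numpy as np

def rnn_cell(x_t, h_prev, Wx, Wh, b):
    """One RNN step: new hidden state from current input and previous state."""
    return np.tanh(x_t @ Wx + h_prev @ Wh + b)

# The same cell (same weights) is applied at every time step.
rng = np.random.default_rng(0)
Wx, Wh, b = rng.normal(size=(3, 4)), rng.normal(size=(4, 4)), np.zeros(4)
h = np.zeros(4)
for x_t in rng.normal(size=(5, 3)):   # a sequence of 5 input vectors
    h = rnn_cell(x_t, h, Wx, Wh, b)
```

Because the hidden state `h` is carried forward, the cell can use earlier inputs when processing later ones, which a plain feed-forward NN cannot do.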

N Cells, N Layers >> Deep RNN: stacking cells so each layer's hidden states become the next layer's inputs.
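A sketch of stacking cells into layers (assuming, for brevity, that the input dimension equals the hidden size; all names are hypothetical):

```python
import numpy as np

def rnn_cell(x_t, h_prev, Wx, Wh, b):
    return np.tanh(x_t @ Wx + h_prev @ Wh + b)

rng = np.random.default_rng(6)
H = 4
layers = [  # each layer keeps its own weights and its own hidden state
    {"Wx": rng.normal(size=(H, H)), "Wh": rng.normal(size=(H, H)),
     "b": np.zeros(H), "h": np.zeros(H)}
    for _ in range(3)
]

for x_t in rng.normal(size=(5, H)):       # 5 time steps
    inp = x_t
    for layer in layers:                  # layer l's output feeds layer l+1
        layer["h"] = rnn_cell(inp, layer["h"], layer["Wx"],
                              layer["Wh"], layer["b"])
        inp = layer["h"]
```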

2. Bidirectional RNN
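A bidirectional RNN runs one pass left-to-right and another right-to-left, then combines both hidden states at each position. A minimal sketch (separate weights per direction; all names are hypothetical):

```python
import numpy as np

def rnn_pass(xs, Wx, Wh, b):
    """Run an RNN over a sequence and return the hidden state at each step."""
    h, hs = np.zeros(Wh.shape[0]), []
    for x_t in xs:
        h = np.tanh(x_t @ Wx + h @ Wh + b)
        hs.append(h)
    return hs

rng = np.random.default_rng(1)
Wxf, Whf = rng.normal(size=(3, 4)), rng.normal(size=(4, 4))  # forward weights
Wxb, Whb = rng.normal(size=(3, 4)), rng.normal(size=(4, 4))  # backward weights
b = np.zeros(4)
xs = rng.normal(size=(5, 3))

fwd = rnn_pass(xs, Wxf, Whf, b)              # left-to-right
bwd = rnn_pass(xs[::-1], Wxb, Whb, b)[::-1]  # right-to-left, then realigned
outputs = [np.concatenate([f, r]) for f, r in zip(fwd, bwd)]
```

Each output vector now depends on both past and future context, which helps tasks like named-entity recognition where later words disambiguate earlier ones.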

3. RNN — GRU — LSTM

3.1 RNN

3.2 LSTM
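The LSTM adds a cell state controlled by forget, input, and output gates. A minimal single-step sketch (stacked weight matrix `W` and the gate ordering are implementation choices assumed here, not from the original post):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def lstm_cell(x_t, h_prev, c_prev, W, b):
    """One LSTM step: forget/input/output gates plus a candidate cell value."""
    z = np.concatenate([x_t, h_prev]) @ W + b   # compute all four gates at once
    f, i, o, g = np.split(z, 4)
    f, i, o, g = sigmoid(f), sigmoid(i), sigmoid(o), np.tanh(g)
    c = f * c_prev + i * g        # cell state: keep old memory, add new
    h = o * np.tanh(c)            # hidden state: gated view of the cell
    return h, c

rng = np.random.default_rng(2)
H, X = 4, 3
W, b = rng.normal(size=(X + H, 4 * H)), np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x_t in rng.normal(size=(5, X)):
    h, c = lstm_cell(x_t, h, c, W, b)
```

The additive update of `c` is what lets gradients flow over long spans, easing the vanishing-gradient problem of the plain RNN.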

4. Word Representation

- One-hot Encoding

- Word Embedding

- Word2Vec

4.1 One-hot Encoding

Each word is represented by a sparse vector as long as the vocabulary, with a single 1 at that word's index and 0 everywhere else.
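For example, with a toy three-word vocabulary:

```python
import numpy as np

vocab = ["cat", "dog", "fish"]            # toy vocabulary
index = {w: i for i, w in enumerate(vocab)}

def one_hot(word):
    v = np.zeros(len(vocab))
    v[index[word]] = 1.0                  # a single 1, all other entries 0
    return v

print(one_hot("dog"))   # [0. 1. 0.]
```

With a real vocabulary of tens of thousands of words, these vectors become huge, and any two of them are orthogonal, so one-hot encoding carries no notion of word similarity.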

4.2 Word Embedding

Each word is represented by a dense vector of learned features, far shorter than the vocabulary size, so similar words get similar vectors.
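In practice this is a lookup into an embedding matrix (here randomly initialized for illustration; in a real model the matrix is learned):

```python
import numpy as np

vocab = ["cat", "dog", "fish"]
index = {w: i for i, w in enumerate(vocab)}

rng = np.random.default_rng(3)
E = rng.normal(size=(len(vocab), 2))   # embedding matrix: 2 features per word

def embed(word):
    return E[index[word]]              # dense lookup row for this word
```

The lookup is equivalent to multiplying the word's one-hot vector by `E`, but far cheaper.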

4.3 Word2Vec

Word2Vec learns the word embeddings with a shallow neural network (an encoder and decoder): embed a center word, then predict the words around it.
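A sketch of the skip-gram forward pass (toy sizes; `E_in` acts as encoder, `E_out` as decoder; both would be trained, here they are random for illustration):

```python
import numpy as np

V, D = 5, 3                         # toy vocabulary size and embedding dim
rng = np.random.default_rng(4)
E_in, E_out = rng.normal(size=(V, D)), rng.normal(size=(D, V))

def skipgram_probs(center_idx):
    """Embed the center word, then score every vocab word as its context."""
    h = E_in[center_idx]            # encoder: embedding lookup
    scores = h @ E_out              # decoder: a score per vocabulary word
    e = np.exp(scores - scores.max())
    return e / e.sum()              # softmax over the vocabulary

p = skipgram_probs(2)
```

After training on real text, the rows of `E_in` are the word embeddings; the decoder is discarded.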

5. Attention Model

The model learns where (or when) in the input sequence to focus while producing each output step.
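The core computation can be sketched as a softmax-weighted sum over encoder states (dot-product scoring assumed here for simplicity; the original attention model uses a small learned network to score):

```python
import numpy as np

def attention(query, keys, values):
    """Weight each input position by its relevance to the query."""
    scores = keys @ query                 # dot-product relevance per position
    e = np.exp(scores - scores.max())
    w = e / e.sum()                       # attention weights sum to 1
    return w @ values                     # weighted sum of the input states

rng = np.random.default_rng(5)
keys = values = rng.normal(size=(5, 4))   # 5 encoder hidden states
context = attention(rng.normal(size=4), keys, values)
```

The resulting context vector is recomputed at every decoding step, so the model can attend to different input positions for different outputs.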

Reference

Deep Learning Specialization: Sequence Models (Coursera) (YouTube)
