
Review — Deep Recurrent Models with Fast-Forward Connections for Neural Machine Translation (Deep-ED & Deep-Att)

6 min read · Nov 23, 2021

Sik-Ho Tsang @ Medium

Outline

1. F-F connections (a short code sketch follows this outline)

1.1. F-F connections in RNN

1.2. F-F connections in Bidirectional LSTM

2. Deep-ED and Deep-Att: Network Architecture

2.1. Encoder (see the interleaved bi-directional sketch after this outline)

2.2. Interface

2.3. Decoder

2.4. Other Details

3. Experimental Results

3.1. English-to-French

Effect of F-F Connections
Different LSTM layer widths in Deep-Att
Effect of the interleaved bi-directional encoder
Deep-Att with different model depths
Encoders with different numbers of columns and LSTM layer widths

3.2. English-to-German
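
Before the detailed sections, a minimal sketch of the paper's central idea may help. In Deep-ED and Deep-Att, every stacked layer emits, alongside its LSTM hidden state h_t, a fast-forward path f_t = W_f·x_t: a plain linear transform of the block input, with no recurrence and no activation, concatenated into the block output so that gradients have a short route through very deep stacks. Below is a simplified sketch assuming PyTorch, not the authors' implementation; the class name FFLSTMBlock and the layer sizes are illustrative, and the exact wiring inside each block differs slightly in the paper.

```python
import torch
import torch.nn as nn

class FFLSTMBlock(nn.Module):
    """One deep-stack block with a fast-forward (F-F) path (sketch)."""

    def __init__(self, input_size: int, hidden_size: int, ff_size: int):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        # F-F path: a bare linear map, no recurrence, no nonlinearity.
        self.ff = nn.Linear(input_size, ff_size, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, input_size)
        h, _ = self.lstm(x)               # recurrent, nonlinear path
        f = self.ff(x)                    # fast-forward path f_t = W_f x_t
        return torch.cat([h, f], dim=-1)  # next block sees [h_t; f_t]
```

When such blocks are stacked, the next block's input_size is hidden_size + ff_size, so the linear path threads through the whole depth.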
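
The encoder item (2.1) refers to the interleaved bi-directional design: instead of running a forward and a backward RNN side by side in every layer, successive layers simply alternate reading direction, left-to-right, then right-to-left, and so on. Here is a sketch under the same assumptions as above, reusing the illustrative FFLSTMBlock; flipping the time axis before and after a layer is one standard way to realize a reversed-direction layer.

```python
class InterleavedBiEncoder(nn.Module):
    """Stack of F-F blocks whose reading direction alternates per layer."""

    def __init__(self, sizes):
        # sizes: list of (input_size, hidden_size, ff_size) per layer,
        # where each input_size equals the previous hidden_size + ff_size.
        super().__init__()
        self.blocks = nn.ModuleList(FFLSTMBlock(i, h, f) for i, h, f in sizes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, features)
        for k, block in enumerate(self.blocks):
            if k % 2 == 1:
                # Odd-indexed layers read right-to-left: flip time,
                # run the block, then flip back to the original order.
                x = torch.flip(block(torch.flip(x, [1])), [1])
            else:
                x = block(x)  # even-indexed layers read left-to-right
        return x
```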

Reference
[2016 TACL] [Deep-ED & Deep-Att]
Deep Recurrent Models with Fast-Forward Connections for Neural Machine Translation

Natural Language Processing (NLP)

My Other Previous Paper Readings


Written by Sik-Ho Tsang

PhD, Researcher. I share what I learn. :) Linktree: https://linktr.ee/shtsang for Twitter, LinkedIn, etc.
