Machine Learning is Fun Part 5: Language Translation with Deep Learning and the Magic of Sequences
Adam Geitgey

Hi Adam,

Thanks for the AMAZING series of articles.

A question regarding the seq-to-seq flow described in the article:

Since each sequence is trained separately to create the encoding, and since the encoding itself doesn’t have any meaning outside that specific flow, how can we be sure that the encoding → decoding flow will generate the desired outcome?

For example:

How can we make sure that encoding the English sentence and encoding the Spanish one produce the same vector?
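To make the flow being asked about concrete, here is a toy sketch of an encoder producing a fixed-size vector and a decoder reading it back out. This is not the article's model (no LSTMs, no training loop); all names, vocabularies, and dimensions are made up for illustration. The key detail the question touches on is in the comments: in a real seq2seq system the encoder and decoder are trained jointly by a single loss on the translation output, so the vector never needs a meaning outside its paired pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabularies and encoding size.
VOCAB_EN = {"the": 0, "cat": 1, "sat": 2}
VOCAB_ES = {"el": 0, "gato": 1, "se": 2, "sento": 3}
DIM = 4  # size of the shared encoding vector

# Encoder stand-in: embed each English token and average into
# one fixed-size vector (a real model would use a recurrent net).
embed_en = rng.normal(size=(len(VOCAB_EN), DIM))

def encode(tokens):
    ids = [VOCAB_EN[t] for t in tokens]
    return embed_en[ids].mean(axis=0)

# Decoder stand-in: score each Spanish token from the encoding vector.
decode_w = rng.normal(size=(DIM, len(VOCAB_ES)))

def decode(vec):
    scores = vec @ decode_w
    return int(np.argmax(scores))

# The encoding has no standalone meaning; it is only an input format
# the paired decoder has learned to read. During training, the error on
# the Spanish output would update embed_en and decode_w together, which
# is why the two halves end up agreeing on what the vector encodes.
vec = encode(["the", "cat", "sat"])
print(vec.shape)     # a single fixed-size vector, regardless of sentence length
print(decode(vec))   # index of one (untrained, hence arbitrary) Spanish token
```

Note that nothing forces the English encoder and a Spanish encoder to emit the same vector on their own; in the article's setup they only become compatible because they are optimized end-to-end against the same translation objective.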
