Dr. Bugga
Evolutionary Machines
1 min read · May 9, 2017


Facebook Artificial Intelligence Research (FAIR) published a paper on a CNN-based approach to language translation that is nine times faster than translation using RNNs.

“A distinguishing component of our architecture is multi-hop attention. An attention mechanism is similar to the way a person would break down a sentence when translating it: Instead of looking at the sentence only once and then writing down the full translation without looking back, the network takes repeated “glimpses” at the sentence to choose which words it will translate next, much like a human occasionally looks back at specific keywords when writing down a translation” — FAIR
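The "glimpses" FAIR describes can be sketched in a few lines: each glimpse scores every source word against the decoder's current query, takes a weighted summary (context), and feeds it back before the next glimpse. This is a minimal NumPy illustration of the idea, not FAIR's actual ConvS2S implementation; the additive query update and the function names here are assumptions for clarity.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention_glimpse(query, source_states):
    # One "glimpse": score every source word against the decoder query,
    # then return a weighted summary (context) of the source sentence.
    scores = source_states @ query             # (src_len,)
    weights = softmax(scores)                  # attention distribution over source words
    context = weights @ source_states          # weighted sum of source states
    return context, weights

def multi_hop_attention(query, source_states, hops=3):
    # Multi-hop: refine the query with each glimpse's context, so later
    # hops can attend to different source words. (Simple additive update;
    # the real ConvS2S applies attention per decoder layer.)
    for _ in range(hops):
        context, weights = attention_glimpse(query, source_states)
        query = query + context
    return query, weights

rng = np.random.default_rng(0)
source = rng.normal(size=(6, 8))   # 6 source words, 8-dim encoder states
q = rng.normal(size=8)             # decoder query for the next target word
out, w = multi_hop_attention(q, source)
print(w.round(3))                  # final attention weights sum to 1
```

Each hop produces a fresh attention distribution, which is how the network "looks back" at different keywords as translation proceeds.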
