Glossary of Deep Learning: Word Embedding

Jaron Collis
Apr 19, 2017 · 5 min read
A plot of word embeddings in English and German. The semantic equivalence of words has been inferred from their context, so words with similar meanings are co-located. This is because the relative semantics of words are consistent, whatever the language. [Source: Luong et al]

Introducing Word2Vec

The word2vec architecture consists of a hidden layer and an output layer. [Source]
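To make those two layers concrete, here is a minimal, skip-gram-flavoured sketch of the forward pass in Python with NumPy. The vocabulary size, embedding dimension and random weights are illustrative assumptions, not values from the article: given one input word, the network produces a probability for every word in the vocabulary.

import numpy as np

# Minimal sketch of a word2vec-style forward pass (skip-gram flavour).
# Dimensions and weight values are illustrative, not from the article.
vocab_size, embedding_dim = 5, 3

rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(vocab_size, embedding_dim))   # input -> hidden weights
W_output = rng.normal(size=(embedding_dim, vocab_size))   # hidden -> output weights

def forward(word_index):
    # One-hot input vector for the centre word.
    x = np.zeros(vocab_size)
    x[word_index] = 1.0
    # Hidden layer: a plain linear projection, no activation function.
    h = x @ W_hidden                  # equivalent to W_hidden[word_index]
    # Output layer: one score per word in the vocabulary.
    scores = h @ W_output
    # Softmax turns the scores into probabilities of each context word.
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()

print(forward(2))   # probability distribution over the 5-word vocabulary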
Multiplying a word’s one-hot vector by the hidden layer’s weight matrix simply selects one row of that matrix. So, for the word that’s the 4th entry in the vocabulary, its embedding is the matrix’s 4th row, (10,12,19).
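As a rough illustration of that lookup, here is a NumPy sketch in which the weight matrix is invented so that its 4th row is (10, 12, 19); multiplying by a one-hot vector just picks out that row.

import numpy as np

# The one-hot trick: multiplying a one-hot vector by the weight matrix
# simply selects one row. The matrix values are made up so that the
# 4th word's row is (10, 12, 19), matching the figure.
W = np.array([
    [17,  2,  4],
    [11,  6,  9],
    [22, 14,  8],
    [10, 12, 19],   # 4th entry in the vocabulary
    [13,  5,  7],
])

one_hot = np.array([0, 0, 0, 1, 0])   # one-hot vector for the 4th word

print(one_hot @ W)   # [10 12 19]
print(W[3])          # same result: the lookup is just row selection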

Word embeddings capture similarities based on context, which might be gender, tense, geography or something else entirely. The lines shown are just mathematical vectors, so you can move ‘across’ embedding space from “King” to “Queen” by subtracting “Man” and adding “Woman”.
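For a hands-on version of this vector arithmetic, here is a sketch using the gensim library and its downloadable “glove-wiki-gigaword-50” vectors; both the library and that particular vector set are assumptions on my part, not something the article prescribes.

# Sketch of the 'King' to 'Queen' analogy with gensim and pretrained
# GloVe vectors (an assumption here, not specified by the article).
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")   # downloads on first use

# king - man + woman ~= queen: start from 'king', subtract 'man', add 'woman'.
result = vectors.most_similar(positive=["king", "woman"],
                              negative=["man"], topn=1)
print(result)   # expected: 'queen' as the nearest word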


