A Complete Guide to Understanding the Evolution of Word to Vector

A walkthrough of word embeddings in NLP, from Bag of Words and Word2vec to GloVe, BERT, and more

Published in Co-Learning Lounge · 17 min read · Jun 19, 2021

Written by Indrajit Singh & Yogesh Kothiya

What is a word embedding?

Word embeddings are a type of word representation, typically dense vectors of real numbers, that allows words with similar meanings to have similar representations.

You shall know a word by the company it keeps — J.R. Firth (1957)
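
As a minimal illustration (the words and vectors below are hand-picked toy values, not learned embeddings), similarity between word vectors is commonly measured with cosine similarity:

```python
import numpy as np

# Toy 3-dimensional word vectors (hand-picked for illustration only;
# real embeddings are learned from data and have hundreds of dimensions).
embeddings = {
    "dog":    np.array([0.90, 0.10, 0.20]),
    "puppy":  np.array([0.85, 0.15, 0.25]),
    "banana": np.array([0.10, 0.90, 0.70]),
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: close to 1.0 means
    # the vectors (and, ideally, the word meanings) are similar.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine_similarity(embeddings["dog"], embeddings["puppy"]))   # ~0.996 (similar)
print(cosine_similarity(embeddings["dog"], embeddings["banana"]))  # ~0.30  (dissimilar)
```

In practice these vectors are not hand-picked; they are learned from large text corpora by the methods this guide walks through, from Word2vec onwards.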

Why do we need word embeddings for NLP tasks?

Every NLP application we build today has a single purpose: to make computers understand human language. The biggest challenge is getting machines to process language the way we do when we read, write, or speak.

To start, we train machine learning or deep learning algorithms on textual data. Since machines do not understand raw text, we need to convert the input into a machine-readable, numeric format.

For example, imagine I’m trying to describe my dog —
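
One of the simplest machine-readable formats is a bag-of-words count matrix. Below is a minimal sketch, assuming scikit-learn is installed and using two hypothetical sentences about the dog:

```python
from sklearn.feature_extraction.text import CountVectorizer

# Two hypothetical sentences describing the dog (illustrative text only).
corpus = [
    "my dog is small and friendly",
    "my dog loves to play fetch",
]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(corpus)  # sparse document-term count matrix

print(vectorizer.get_feature_names_out())  # the learned vocabulary
print(X.toarray())                         # each row: one sentence as word counts
```

Each sentence becomes a row of counts over a shared vocabulary. This representation ignores word order and meaning entirely, which is exactly the limitation that the embedding methods covered in the rest of this guide set out to fix.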
