Word Embeddings: CBOW and Skip-gram
Since the advent of transformers, NLP has gained a lot of traction, and a wide variety of tasks can now be handled by GPT-3 and other large transformer-based models. But today we are going to take a step back and learn about word embeddings. In this blog, we are primarily going to look at CBOW (Continuous Bag of Words) and Skip-gram. These embeddings are essential for converting text into numbers that a model can work with. So, without further ado, let's dive into the basics of NLP.
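Before digging into the details, it helps to see the one difference that separates the two models: CBOW predicts a word from its surrounding context, while Skip-gram predicts the surrounding context from the word. Here is a minimal sketch (my own illustration, not code from this post) of how each model would slice a sentence into training pairs with a context window of 1:

```python
# Illustrative only: build (input, target) training pairs for CBOW
# and Skip-gram from a toy sentence, using a context window of 1.

sentence = "the quick brown fox jumps".split()
window = 1

cbow_pairs = []       # (context words, center word)
skipgram_pairs = []   # (center word, one context word)

for i, word in enumerate(sentence):
    # Collect the words within `window` positions of the center word.
    context = [sentence[j]
               for j in range(max(0, i - window),
                              min(len(sentence), i + window + 1))
               if j != i]
    # CBOW: the whole context predicts the center word.
    cbow_pairs.append((context, word))
    # Skip-gram: the center word predicts each context word separately.
    for c in context:
        skipgram_pairs.append((word, c))

print(cbow_pairs[1])      # (['the', 'brown'], 'quick')
print(skipgram_pairs[0])  # ('the', 'quick')
```

Each of these pairs becomes one training example for a small neural network; the embeddings are the weights that network learns along the way, which is what the rest of this post walks through.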