Important Natural Language Processing Frameworks You Should Know (NLP Infographic)

Mohd Sanad Zaki Rizvi · Published in Analytics Vidhya · Aug 28, 2019

Have you heard about the latest Natural Language Processing framework that was released recently?

I don’t blame you if you’re still catching up with the superb StanfordNLP library or the PyTorch-Transformers framework!
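If you want a quick feel for how approachable these libraries are, here is a minimal sketch using the stanfordnlp Python package. The example sentence and the model-download step are my own illustrative additions, not part of the original post:

```python
# A quick taste of the StanfordNLP library
# (assumes `pip install stanfordnlp` and disk space for the English models).
import stanfordnlp

stanfordnlp.download('en')     # one-time download of the English models
nlp = stanfordnlp.Pipeline()   # builds the default English pipeline

doc = nlp("Attention is all you need, or so the Transformer paper claims.")
# Print the dependency parse of the first sentence
doc.sentences[0].print_dependencies()
```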

There has been a remarkable rise in the amount of research and breakthroughs happening in NLP in the last couple of years.

I can trace this recent rise to one seismic paper: "Attention Is All You Need", published by researchers at Google in June 2017.

This breakthrough has spawned so many new and exciting NLP libraries that enable us to work with text in ways that were previously limited to our imagination (or Hollywood).

Google Trends shows a clear rise in interest in natural language processing over the last 5 years in the US, and we see a similar pattern when we expand the search to the entire globe.

Today, we have state-of-the-art approaches for language modeling, transfer learning, and many other important and advanced NLP tasks. Most of these rely on deep learning, especially the Transformer architecture introduced in the above paper.
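To make that concrete, here is a minimal, hedged sketch of Transformer-based language modeling with the pytorch-transformers package. The prompt text and the greedy next-word step are illustrative choices of mine, not taken from the original article:

```python
# Minimal language-modeling sketch with pytorch-transformers
# (assumes `pip install pytorch-transformers torch`).
import torch
from pytorch_transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained('gpt2')  # downloads the GPT-2 vocab
model = GPT2LMHeadModel.from_pretrained('gpt2')    # downloads the GPT-2 weights
model.eval()

text = "Natural language processing has seen a remarkable"
input_ids = torch.tensor([tokenizer.encode(text)])

with torch.no_grad():
    logits = model(input_ids)[0]  # shape: (batch, seq_len, vocab_size)

# Greedy next-word prediction from the pretrained language model
next_token_id = torch.argmax(logits[0, -1, :]).item()
print(text + tokenizer.decode([next_token_id]))
```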

So we decided to collate all the important developments in one place and in one neat timeline.

I have listed a few tutorials below to help you get started with these frameworks:

Without any further ado, here is the infographic in all its glory! And if you want to download the high-resolution PDF (which you really should), head over here.

[Infographic: NLP Latest Breakthroughs]

Originally published at https://www.analyticsvidhya.com on August 28, 2019.
