Important Natural Language Processing Frameworks You Should Know (NLP Infographic)
Have you heard about the latest Natural Language Processing framework that was released recently?
I don’t blame you if you’re still catching up with the superb StanfordNLP library or the PyTorch-Transformers framework!
There has been a remarkable rise in the amount of research and breakthroughs happening in NLP in the last couple of years.
I can trace much of this recent rise to one seismic paper: "Attention Is All You Need", published by researchers at Google in June 2017.
This breakthrough has spawned so many new and exciting NLP libraries that enable us to work with text in ways that were previously limited to our imagination (or Hollywood).
Here is the interest in natural language processing in the US over the last five years, according to Google search trends:
We can see a similar pattern when we expand the search to include the entire globe!
Today, we have state-of-the-art approaches for language modeling, transfer learning, and many other important and advanced NLP tasks. Most of these involve deep learning, especially the Transformer architecture introduced in the paper above.
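At the heart of that Transformer architecture is scaled dot-product attention. As a minimal sketch of the formula from the paper, implemented here with NumPy (this is an illustrative toy, not any particular framework's API):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each row sums to 1
    return weights @ V                            # weighted average of the value vectors

# Toy example: 2 queries attending over 3 key/value pairs of dimension 4
Q = np.random.rand(2, 4)
K = np.random.rand(3, 4)
V = np.random.rand(3, 4)
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4): one attended vector per query
```

Every output row is simply a softmax-weighted mixture of the value vectors, which is why attention can relate any two positions in a sequence in a single step. The frameworks in the timeline below (StanfordNLP, PyTorch-Transformers, and friends) wrap stacks of this building block behind much friendlier APIs.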
So we decided to collate all of these important developments in one place, in one neat timeline.
I have listed a few tutorials below to help you get started with these frameworks:
- How do Transformers Work in NLP?
- Tutorial on Text Classification (NLP) using ULMFiT and fastai Library in Python
- Introduction to StanfordNLP: An Incredible State-of-the-Art NLP Library for 53 Languages
- A Comprehensive Guide to Build your own Language Model in Python!
- OpenAI’s GPT-2: A Simple Guide to Build the World’s Most Advanced Text Generator in Python
- Introduction to PyTorch-Transformers: An Incredible Library for State-of-the-Art NLP (with Python code)
- A Comprehensive Course on Natural Language Processing using Python
Without further ado, here is the infographic in all its glory! And if you want to download the high-resolution PDF (which you really should), head over here.
Originally published at https://www.analyticsvidhya.com on August 28, 2019.