Awesome NLP — 18 High-Quality Resources for studying NLP
Tutorials, code examples, video courses, course notes, and articles
4 min read · Jan 14, 2022
This article collects high-quality resources for studying Natural Language Processing (NLP). It is intended for readers who want to approach NLP and already have some machine learning basics, or for those who already know a bit of NLP and want to deepen their knowledge.
Articles
- Modern Deep Learning Techniques Applied to Natural Language Processing: this is an overview of trends in deep learning-based Natural Language Processing, last updated in 2019. It covers the theoretical descriptions and implementation details behind deep learning models, such as recurrent neural networks, convolutional neural networks, and reinforcement learning, used to solve various NLP tasks and applications.
- Visual guides to NLP concepts: a blog from Amit Chaudhary about data science with a focus on NLP. There are 19 posts about NLP which are both detailed and very well explained.
- Jay Alammar blog: Jay’s blog is rich in high-quality NLP posts such as The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning), The Illustrated Transformer, and How GPT3 Works — Visualizations and Animations.
- How to solve 90% of NLP problems: a step-by-step guide: this blog post explains how to build machine learning solutions to NLP problems, drawing on the experience of many practitioners in the field. It starts with the simplest method that could work, and then moves on to more nuanced solutions, such as feature engineering, word vectors, and deep learning.
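The "simplest method that could work" in that guide is typically a bag-of-words representation fed to a linear classifier. As a minimal illustrative sketch (plain Python, not code from the post itself), the vectorization step looks like this:

```python
from collections import Counter

def build_vocab(texts):
    """Map each unique token to a column index."""
    vocab = {}
    for text in texts:
        for token in text.lower().split():
            if token not in vocab:
                vocab[token] = len(vocab)
    return vocab

def bag_of_words(text, vocab):
    """Count vector with one entry per vocabulary word."""
    counts = Counter(text.lower().split())
    return [counts.get(token, 0) for token in vocab]

texts = ["the movie was great", "the movie was terrible"]
vocab = build_vocab(texts)
vectors = [bag_of_words(t, vocab) for t in texts]
```

The resulting count vectors can be passed directly to any off-the-shelf linear model; in practice you would use a library vectorizer (e.g. scikit-learn's `CountVectorizer`) rather than rolling your own.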
Tutorials
- Deep Learning for NLP with PyTorch: this tutorial walks you through the key ideas of deep learning programming using PyTorch. It focuses specifically on NLP for people who have never written code in any deep learning framework, but it assumes a working knowledge of core NLP tasks.
Code examples
- NLP Quickbook: intended for practitioners to read quickly, skim, select what is useful, and move on. There are several notebooks divided into logical themes: text processing, text classification, text cleaning, spell correction, linguistics, text representations, deep learning for NLP, and chatbots.
- The Super Duper NLP Repo: a collection of Colab notebooks covering a wide array of NLP task implementations. It contains 300+ notebooks.
Video courses
- A Code-First Introduction to Natural Language Processing (fast.ai): this course covers a blend of traditional NLP topics and recent neural network approaches. It also deals with urgent ethical issues, such as bias and disinformation.
- Accelerated Natural Language Processing (Machine Learning University): lectures go from introduction to NLP and text processing to recurrent neural networks and transformers.
- Coursera Natural Language Processing specialization (DeepLearning.ai): it is basically NLP from A to Z, in a four-month-long online course. The final topics covered are encoder-decoder models, attention for advanced machine translation, text summarization, question answering, chatbots, T5, BERT, transformers, the Reformer, and Hugging Face models.
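The attention mechanism that these courses build up to can be sketched in a few lines of plain Python. This is an illustrative scaled dot-product attention for a single query vector (a simplification for clarity, not code from any of the courses above):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attention(query, keys, values):
    """Scaled dot-product attention for one query.

    Scores each key against the query, normalizes the scores
    with softmax, and returns the weighted sum of the values.
    """
    d = len(query)
    scores = [dot(query, k) / math.sqrt(d) for k in keys]
    weights = softmax(scores)
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# The query matches the first key, so the first value dominates the output.
context = attention([1.0, 0.0],
                    [[1.0, 0.0], [0.0, 1.0]],
                    [[10.0, 0.0], [0.0, 10.0]])
```

Real transformer implementations batch this over matrices of queries, keys, and values and add multiple heads, but the core weighted-sum idea is exactly this.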
Course notes
- Deep Natural Language Processing lectures (Oxford): a lecture series from Oxford. This is an applied course focusing on recent advances in analyzing and generating speech and text using recurrent neural networks.
- NLP notes by Dr. Jacob Eisenstein (GeorgiaTech): this course gives an overview of modern data-driven techniques for natural language processing. The course moves from shallow bag-of-words models to richer structural representations of how words interact to create meaning.
- From Languages to Information (Stanford): this course covers the basics of text processing, sentiment analysis, information retrieval, chatbots, and more. It is suggested for people new to programming or who are just starting with NLP.
- Natural Language Processing with Deep Learning (Stanford): this course is an introduction to cutting-edge research in Deep Learning for NLP. It covers word embeddings, neural networks with PyTorch, transformers, question answering, text generation, and so on.
- Applied Natural Language Processing (Berkeley): this course examines the use of natural language processing as a set of methods for exploring and reasoning about text as data, focusing especially on the applied side of NLP: using existing NLP methods and libraries in Python in new and creative ways.
- Natural Language Processing (CMU): this course is about a variety of ways to represent human languages, and how to exploit those representations to build models to perform translation, summarization, extracting information, question answering, natural language interfaces to databases, conversational agents, and so on.
Repositories with more content
- NLP progress: a repository that tracks the progress in Natural Language Processing, including the datasets and the current state-of-the-art for the most common NLP tasks.
- Awesome NLP repo: a GitHub repository containing a curated list of resources dedicated to Natural Language Processing.