PyTorch Transformers for state-of-the-art NLP

Hugging Face open-sources a new library that contains 27 pretrained models for state-of-the-art NLP/NLU tasks.

elvis
DAIR.AI
2 min read · Jul 17, 2019


PyTorch Transformers 1.0

Hugging Face, the NLP startup behind several social AI apps and open-source libraries such as PyTorch BERT, just released a new Python library called PyTorch Transformers.

The Transformer is a relatively new architecture used to train high-performing, efficient models for natural language processing (NLP) and natural language understanding (NLU) tasks such as question answering and sentiment analysis. Several of the recent techniques that improved and advanced the performance of NLP models, such as XLNet and BERT, are all based on variations of the Transformer architecture.

The first release of PyTorch Transformers makes it easy to use state-of-the-art pretrained models for natural language processing (NLP) based on the Transformer architecture. It's now possible to leverage models such as Google's BERT, OpenAI's GPT-2, Transformer-XL, Facebook's XLM, and XLNet. In fact, according to Thomas Wolf, Lead Scientist at Hugging Face, there are a total of 27 pretrained models available in the PyTorch Transformers library.
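To give a feel for how little code this takes, here is a minimal sketch of loading a pretrained BERT checkpoint and encoding a sentence, following the `from_pretrained` pattern from the library's documentation (the example sentence is the one used in the library's own README):

```python
import torch
from pytorch_transformers import BertModel, BertTokenizer

# Download (and cache) the pretrained weights and vocabulary by shortcut name.
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')
model.eval()  # disable dropout for deterministic inference

# Encode a sentence and get its contextual token representations.
input_ids = torch.tensor([tokenizer.encode("Here is some text to encode")])
with torch.no_grad():
    last_hidden_states = model(input_ids)[0]  # (batch, seq_len, hidden_size)
```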

This is an exciting release, as researchers and engineers now have access to a unified API that makes it easy to leverage the latest techniques coming out of NLP research. In addition to the simple and flexible API, the library comes with several scripts to quickly experiment with state-of-the-art models on GLUE, SQuAD, and text generation, to name a few.
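The same loading pattern works across architectures, which is what makes the unified API appealing. As a rough sketch (and only a sketch: the greedy decoding loop below is my own minimal assumption, not one of the library's bundled scripts, which use more sophisticated sampling), here is how one might generate text with the pretrained GPT-2 checkpoint:

```python
import torch
from pytorch_transformers import GPT2LMHeadModel, GPT2Tokenizer

# Same from_pretrained pattern as with BERT above, different model class.
tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
model = GPT2LMHeadModel.from_pretrained('gpt2')
model.eval()

# Greedily extend a prompt one token at a time (illustrative only).
input_ids = torch.tensor([tokenizer.encode("PyTorch Transformers is")])
with torch.no_grad():
    for _ in range(20):
        logits = model(input_ids)[0]         # (batch, seq_len, vocab_size)
        next_id = logits[0, -1, :].argmax()  # most likely next token
        input_ids = torch.cat([input_ids, next_id.view(1, 1)], dim=1)

print(tokenizer.decode(input_ids[0].tolist()))
```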

And finally, here are the release notes and a link to get started with PyTorch Transformers. Excellent work put together by the Hugging Face team and others. In a few days I will be releasing a follow-up to this blog post showing how to easily use PyTorch Transformers for interesting NLP tasks such as emotion recognition and sentiment analysis. Keep an eye out for that! 🙏
