Hugging Face Implements SOTA Transformer Architectures for PyTorch and TensorFlow 2.0

Synced · Published in SyncedReview · 3 min read · Oct 4, 2019

NLP-focused startup Hugging Face recently released a major update to its popular “PyTorch Transformers” library, establishing compatibility between PyTorch and TensorFlow 2.0 and letting users move easily from one framework to the other over the life of a model for training and evaluation. With the update, Hugging Face has renamed the library simply “Transformers.”

The Transformers GitHub project is designed for everyone from weekend hobbyists to NLP professionals. It remains as easy to use as the previous version while adding compatibility with the deep learning library Keras. The package ships more than 30 pretrained models covering over 100 languages, spanning eight major architectures for natural language understanding (NLU) and natural language generation (NLG): BERT, GPT, GPT-2, Transformer-XL, XLNet, XLM, RoBERTa, and DistilBERT.
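
Any of these pretrained checkpoints can be loaded by name with the library’s Auto classes. The snippet below is an illustrative sketch rather than code from the announcement; the multilingual BERT checkpoint is just one example of the models on offer.

```python
# Illustrative sketch: load one of the supported architectures by checkpoint
# name using the library's Auto* classes (PyTorch shown here).
import torch
from transformers import AutoTokenizer, AutoModel

# A multilingual BERT checkpoint, one of the 100+ language models available.
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")

# Encode a sentence and run it through the model to get hidden states.
input_ids = torch.tensor([tokenizer.encode("Hello, Transformers!", add_special_tokens=True)])
with torch.no_grad():
    outputs = model(input_ids)
last_hidden_states = outputs[0]  # shape: (batch_size, sequence_length, hidden_size)
```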

The Transformers library no longer requires PyTorch to load models, can train SOTA models in as few as three lines of code, and can pre-process a dataset in fewer than 10 lines of code. Sharing trained models also lowers computation costs and carbon emissions.
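
As a rough illustration of the “few lines of code” claim, the sketch below follows the pattern of the library’s GLUE fine-tuning example for TensorFlow 2.0; the dataset choice, hyperparameters, and step counts here are illustrative placeholders, not values from the article.

```python
# Sketch: fine-tune a pretrained model with TF 2.0 / Keras on GLUE MRPC.
import tensorflow as tf
import tensorflow_datasets
from transformers import (BertTokenizer, TFBertForSequenceClassification,
                          glue_convert_examples_to_features)

tokenizer = BertTokenizer.from_pretrained("bert-base-cased")
model = TFBertForSequenceClassification.from_pretrained("bert-base-cased")

# Turn the GLUE MRPC split into a tf.data.Dataset of model-ready features.
data = tensorflow_datasets.load("glue/mrpc")
train_dataset = glue_convert_examples_to_features(data["train"], tokenizer,
                                                  max_length=128, task="mrpc")
train_dataset = train_dataset.shuffle(100).batch(32)

# Compile and train with the standard Keras API.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=3e-5),
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=[tf.keras.metrics.SparseCategoricalAccuracy("accuracy")])
model.fit(train_dataset, epochs=2, steps_per_epoch=115)  # step count is illustrative
```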

The standout feature of this update is interoperability between PyTorch and TensorFlow 2.0. TensorFlow is designed to be production-ready, while PyTorch is easier to learn and use for building prototypes. In the previous PyTorch Transformers library the two frameworks were incompatible, so there was no way to carry a prototype built in PyTorch over to a production pipeline built on TensorFlow. Now users can choose the appropriate framework for each phase of a given language model’s life.
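
In practice the hand-off goes through the shared save_pretrained/from_pretrained checkpoint format. The sketch below assumes that workflow; the local directory name is made up for illustration.

```python
# Prototype in PyTorch, then reload the same weights as a TF 2.0 Keras model.
from transformers import BertForSequenceClassification, TFBertForSequenceClassification

pt_model = BertForSequenceClassification.from_pretrained("bert-base-uncased")
# ... fine-tune / experiment with the PyTorch model here ...
pt_model.save_pretrained("./my-bert")  # writes pytorch_model.bin + config.json

# from_pt=True converts the saved PyTorch weights into the TensorFlow model,
# ready for TF-based serving or further Keras training.
tf_model = TFBertForSequenceClassification.from_pretrained("./my-bert", from_pt=True)
```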

The Transformers library has received more than 14k stars on GitHub and garnered considerable attention on the r/MachineLearning subreddit.

Founded in 2016, Hugging Face is based in New York and closed a US$4 million seed round in May 2018. Its latest paper, “DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter,” is on arXiv and has been accepted by NeurIPS 2019.

The major Transformers changes are described on the project’s GitHub repository, along with detailed installation instructions.

Author: Reina Qi Wan | Editors: Michael Sarazen, Tony Peng

