Google Releases ALBERT V2 & Chinese-Language Models

Synced
Jan 3, 2020

When it was introduced in September 2019, Google’s ALBERT language model achieved SOTA results on popular natural language understanding (NLU) benchmarks like GLUE, RACE, and SQuAD 2.0. Google has now released a major ALBERT v2 update and open-sourced Chinese-language ALBERT models.

ALBERT — as the full name “A Lite BERT” suggests — is a trimmed-down version of the company’s BERT (Bidirectional Encoder Representations from Transformers) language representation model, which has become a mainstay for NLU research. The paper ALBERT: A Lite BERT for Self-supervised Learning of Language Representations has been accepted at ICLR 2020, which will be held this April in the Ethiopian capital Addis Ababa.

As outlined in the Synced report Google’s ALBERT Is a Leaner BERT; Achieves SOTA on 3 NLP Benchmarks, an ALBERT configuration similar to BERT-large has 18x fewer parameters and can be trained about 1.7x faster.
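For a rough sense of where those savings come from, here is a minimal parameter-count sketch. It assumes PyTorch and the Hugging Face transformers library (neither is part of Google’s original release) and uses configurations approximating BERT-large and ALBERT-large as described in the paper:

```python
# A rough parameter-count comparison, assuming PyTorch and the Hugging Face
# `transformers` library (not part of Google's original TensorFlow release).
from transformers import AlbertConfig, AlbertModel, BertConfig, BertModel


def count_params(model):
    return sum(p.numel() for p in model.parameters())


# BERT-large: 24 layers, hidden size 1024, 16 heads, intermediate size 4096.
bert_large = BertModel(BertConfig(
    num_hidden_layers=24, hidden_size=1024,
    num_attention_heads=16, intermediate_size=4096))

# ALBERT-large: same depth and width, but with a factorized 128-dim embedding
# and cross-layer parameter sharing, which is where the savings come from.
albert_large = AlbertModel(AlbertConfig(
    embedding_size=128, num_hidden_layers=24, hidden_size=1024,
    num_attention_heads=16, intermediate_size=4096))

print(f"BERT-large:   ~{count_params(bert_large) / 1e6:.0f}M parameters")
print(f"ALBERT-large: ~{count_params(albert_large) / 1e6:.0f}M parameters")
# Per the paper, this works out to roughly 334M vs. roughly 18M parameters.
```

The gap comes almost entirely from ALBERT’s factorized embedding parameterization and cross-layer parameter sharing, rather than from a shallower or narrower network.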


Major changes in the ALBERT v2 models involve three novel strategies: no dropout, additional training data, and longer training time. The researchers trained ALBERT-base for 10M steps and the other models for 3M steps. The results show that ALBERT v2 performance generally improves significantly over the first version.

The exception is ALBERT-xxlarge, whose v2 performance is slightly worse than the first version’s. The researchers identify two probable causes: first, training for an additional 1.5M steps did not lead to significant performance improvement; second, for v1 they did some hyperparameter search among the parameter sets, while for v2 they adopted the parameters from v1 and only fine-tuned the hyperparameters for the RACE test. “Given that the downstream tasks are sensitive to the fine-tuning hyperparameters, we should be careful about so called slight improvements.”

Google has also released Chinese-language ALBERT models built using training data from the Chinese Language Understanding Evaluation (CLUE) benchmark.

The paper ALBERT: A Lite BERT for Self-supervised Learning of Language Representations is on arXiv. The ALBERT v2 models are available on the project’s GitHub page here.
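As a quick usage illustration, the released v2 checkpoints can also be loaded through the Hugging Face transformers port. This is a minimal sketch assuming that library is installed; the official repository itself is TensorFlow-based:

```python
# A minimal usage sketch, assuming the Hugging Face `transformers` port of the
# ALBERT v2 checkpoints (the official Google repository uses TensorFlow).
import torch
from transformers import AlbertModel, AlbertTokenizer

tokenizer = AlbertTokenizer.from_pretrained("albert-base-v2")
model = AlbertModel.from_pretrained("albert-base-v2")

# Encode a sentence and run a forward pass without gradient tracking.
inputs = tokenizer("ALBERT is a lite BERT.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state holds one 768-dim vector per token for albert-base-v2.
print(outputs.last_hidden_state.shape)
```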

Author: Yuqing Li | Editor: Michael Sarazen
