Milestone: BERT Boosts Google Search

Synced · Published in SyncedReview · 3 min read · Oct 25, 2019

Google built its brand on Search, and the tech giant has not forgotten that. In what the company calls “the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search,” Google today announced that it has leveraged its pretrained language model BERT to dramatically improve the understanding of search queries.

The next time you search on Google, you won't need to worry as much about speaking or typing each word precisely to get the results you're looking for, thanks to BERT (Bidirectional Encoder Representations from Transformers). BERT is a neural network-based technique for natural language processing (NLP) pretraining that Google introduced and open-sourced last year.
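To make this concrete, the open-sourced BERT checkpoints can be loaded in a few lines of Python. The sketch below assumes the Hugging Face transformers library and the public bert-base-uncased checkpoint; Google's production setup is not public, so this is illustrative only.

```python
# Minimal sketch: load the publicly released BERT checkpoint and encode a
# query. Assumes the Hugging Face "transformers" library and PyTorch are
# installed; this is not the code Google runs in Search.
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("what state is south of nebraska", return_tensors="pt")
outputs = model(**inputs)

# One contextual vector per token (plus [CLS] and [SEP] markers):
print(outputs.last_hidden_state.shape)  # [batch, sequence_length, 768]
```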

When applied to ranking and featured snippets in Search, BERT models process words in relation to all the other words in a sentence, rather than one by one in order. This enables a better "understanding" of context, which is particularly helpful for longer, more conversational queries and for searches where prepositions strongly affect meaning.
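The contrast with one-directional, word-by-word processing can be shown with a short, hedged sketch (again assuming the public checkpoint and the Hugging Face transformers library, not anything Google has disclosed): the same word receives a different vector depending on the words that come after it.

```python
# Hedged illustration (not Google's Search code): BERT's vector for a word
# depends on every other word in the sentence, including the words after it.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased").eval()

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Contextual vector for the first occurrence of `word` in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

# Both sentences are identical up to and including "bank"; only the words
# *after* it disambiguate the meaning, which a strictly left-to-right,
# one-word-at-a-time model could not exploit.
a = word_vector("she walked to the bank to deposit a check", "bank")
b = word_vector("she walked to the bank of the river", "bank")
print(torch.cosine_similarity(a, b, dim=0))  # noticeably below 1.0
```

Because the left context of "bank" is the same in both sentences, any purely left-to-right model would assign it the same representation in both; BERT's bidirectional attention does not.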

Google says its tests show the BERT integration helps Search better understand about one in ten English-language searches in the US when it comes to ranking results. On the company's official blog, Google VP of Search Pandu Nayak shared a few examples of how BERT can surface more relevant results for such queries.

Although the BERT integration in Google Search is currently available only for English queries in the US, Google says it plans to extend it to additional languages and locations. Because BERT enables systems to transfer what they have learned from one language to others, the rest of the world probably won't have to wait long for BERT to work its wizardry on searches in their languages.
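The cross-lingual idea can be sketched with the publicly released multilingual checkpoint (an assumption; the article does not say which model Google will deploy for other languages): one model, pretrained on many languages at once, embeds queries from different languages into a shared representation space.

```python
# Hedged sketch of cross-lingual transfer, assuming the public
# "bert-base-multilingual-cased" checkpoint; Google's internal model for
# non-English Search is not disclosed.
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
model = BertModel.from_pretrained("bert-base-multilingual-cased")

queries = [
    "what state is south of nebraska",         # English
    "quel etat se trouve au sud du nebraska",  # French, same question
]
for q in queries:
    inputs = tokenizer(q, return_tensors="pt")
    # [CLS] vector as a crude sentence representation in the shared space.
    cls = model(**inputs).last_hidden_state[:, 0]
    print(q, cls.shape)
```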

In addition to the software advancements, Google announced that it has also upgraded its hardware: it is now using its latest Cloud TPUs to serve Search results, which the company says will improve both speed and accuracy.

While the performance improvements are impressive, Google acknowledges that natural language understanding remains an ongoing challenge. "Even with BERT, we don't always get it right," Nayak wrote. If you search for "what state is south of Nebraska," BERT's best guess is a community called "South Nebraska." If you have a feeling that's not in Kansas, you're right.

Journalist: Yuan Yuan | Editor: Michael Sarazen

