Amazon Alexa AI’s ‘Language Model Is All You Need’ Explores NLU as QA

Synced | Published in SyncedReview | Nov 9, 2020

New research from Amazon Alexa AI posits that current natural language understanding (NLU) approaches are far from how humans understand language, and asks whether all NLU problems could be efficiently and effectively mapped to question-answering (QA) problems using transfer learning.

Transfer learning is a machine learning (ML) approach for applying knowledge learned from a source domain to a target domain. It has produced promising results in natural language processing (NLP), particularly when transferring knowledge from high-data domains to low-data domains. The Amazon researchers focus on a specific type of transfer learning, in which the target domain is first mapped to the source domain.

Here, NLU is framed as determining the intent and slot (entity) values in natural language utterances. The proposed “QANLU” approach builds slot and intent detection questions and answers on top of NLU-annotated data. QA models are first trained on QA corpora, then fine-tuned on the questions and answers created from the NLU-annotated data. Through transfer learning, this contextual question-answering knowledge is then applied to finding intents or slot values in text inputs.
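
To make the mapping concrete, here is a minimal sketch (not the authors' code) of how a slot-annotated utterance might be recast as extractive QA pairs. The annotation format, the question wordings, and the `nlu_to_qa` helper are illustrative assumptions in the spirit of the Restaurants-8k domain, not the paper's actual data format.

```python
# A minimal sketch of turning slot-annotated NLU data into extractive QA pairs,
# assuming a simple dict-based annotation format (hypothetical, for illustration).

def nlu_to_qa(utterance, slots, slot_questions):
    """Turn one slot-annotated utterance into (question, context, answer) triples.

    slots: dict mapping slot name -> value found in the utterance
    slot_questions: dict mapping slot name -> natural-language question
    """
    qa_pairs = []
    for slot, question in slot_questions.items():
        answer = slots.get(slot, "")  # empty string marks "slot absent"
        qa_pairs.append({"question": question, "context": utterance, "answer": answer})
    return qa_pairs

# Hypothetical example utterance and annotations:
utterance = "book a table for four at 7 pm"
slots = {"people": "four", "time": "7 pm"}
slot_questions = {
    "people": "How many people are in the party?",
    "time": "What time is the booking for?",
    "date": "What date is the booking for?",
}
print(nlu_to_qa(utterance, slots, slot_questions))
```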

Unlike previous approaches, QANLU targets low-resource applications and does not require designing and training new model architectures or extensive data preprocessing. This enables it to achieve strong results in slot and intent detection with an order of magnitude less training data.
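
The claim that no new architecture is needed can be illustrated with an off-the-shelf extractive QA model. Below is a minimal sketch, assuming the Hugging Face transformers library and a SQuAD-distilled checkpoint; the model name, utterance, and question wording are illustrative, not from the paper.

```python
# A minimal sketch of slot extraction as extractive QA, using an off-the-shelf
# SQuAD-trained checkpoint (model choice is an assumption, not the paper's).
from transformers import pipeline

qa = pipeline("question-answering",
              model="distilbert-base-cased-distilled-squad")

utterance = "I need a flight from Boston to Denver tomorrow morning"
result = qa(question="What is the destination city?", context=utterance)
print(result["answer"], result["score"])  # e.g. "Denver" with a confidence score
```

In QANLU, such a model would first be fine-tuned on the QA pairs built from the NLU annotations before being queried this way.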

The researchers conducted experiments on the ATIS and Restaurants-8k datasets. In low-data regimes and few-shot settings, QANLU significantly outperformed sentence classification and token tagging approaches on intent and slot detection tasks, and also surpassed the performance of a recent few-shot intent classification and slot filling (IC/SF) approach.

The researchers say future directions could include expanding beyond this configuration and across different NLP problems, measuring the transfer of knowledge across different NLP tasks, and studying how QANLU questions might be generated automatically based on context.

The paper Language Model Is All You Need: Natural Language Understanding as Question Answering is on arXiv.

Analyst: Yuqing Li | Editor: Michael Sarazen

Synced Report | A Survey of China’s Artificial Intelligence Solutions in Response to the COVID-19 Pandemic — 87 Case Studies from 700+ AI Vendors

This report offers a look at how China has leveraged artificial intelligence technologies in the battle against COVID-19. It is also available on Amazon Kindle. Along with this report, we also introduced a database covering an additional 1,428 artificial intelligence solutions across 12 pandemic scenarios.

Click here to find more reports from us.

We know you don’t want to miss any news or research breakthroughs. Subscribe to our popular newsletter Synced Global AI Weekly to get weekly AI updates.

