Natural Language Processing and the Art of Asking Questions

Questioning is one of the weakest points of the current generation of digital assistants (DAs) and natural language processing (NLP) technologies. Popular conversational products such as Alexa, Siri, or Cortana are perfectly able to engage in fairly rich reactive conversations, but they are still very limited when it comes to formulating and asking questions.

Recently, I had the opportunity to review a research paper published by Microsoft Maluuba about machine learning techniques that can be used for question generation. If you are not familiar with Microsoft Maluuba, it is a team mostly composed of artificial intelligence (AI) researchers that Microsoft acquired at the beginning of the year.

Maluuba’s paper promotes the idea of using reinforcement learning models to automate question generation. The techniques proposed in the paper focus on optimizing the accuracy and grammatical structure of questions. Reinforcement learning techniques maximize a utility (reward) function based on the results of specific actions (read some of the articles about reinforcement learning in this blog), which means the performance of the algorithms can keep improving as they are used. Maluuba’s algorithms leverage reinforcement learning to generate questions that can be answered based on the content of a specific text.
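To make the reward-maximization idea concrete, here is a minimal toy sketch of how a reward signal for generated questions might be shaped. Every scoring function below is a hypothetical stand-in of my own, not Maluuba's actual models: in a real system, answerability would come from a trained question-answering model and fluency from a language model.

```python
# Toy sketch of reward shaping for RL-based question generation.
# All scoring functions are simplistic stand-ins, not the paper's models.

def answerability_reward(question, source_text):
    """Toy proxy: fraction of the question's content words found in the text."""
    stopwords = {"what", "who", "when", "where", "why", "how",
                 "is", "the", "did", "does"}
    content = [w for w in question.lower().rstrip("?").split()
               if w not in stopwords]
    if not content:
        return 0.0
    hits = sum(1 for w in content if w in source_text.lower())
    return hits / len(content)

def fluency_reward(question):
    """Toy proxy: reward well-formed surface structure (wh-word start, '?' end)."""
    q = question.strip()
    starts_ok = q.lower().split()[0] in {"what", "who", "when", "where", "why", "how"}
    ends_ok = q.endswith("?")
    return 0.5 * starts_ok + 0.5 * ends_ok

def total_reward(question, source_text, w_answer=0.7, w_fluency=0.3):
    """Weighted combination, as a policy-gradient learner might maximize."""
    return (w_answer * answerability_reward(question, source_text)
            + w_fluency * fluency_reward(question))

text = "Microsoft acquired Maluuba at the beginning of the year."
print(total_reward("Who acquired Maluuba?", text))    # high: answerable and fluent
print(total_reward("banana purple seventeen", text))  # low: neither
```

A generator trained with policy gradients would sample candidate questions and push up the probability of those with high total reward; the weights here are arbitrary illustration values.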

The efforts by Microsoft Maluuba’s team are still at a very early stage, but the ideas show a lot of promise (at least from a theoretical standpoint). The techniques proposed in the paper, or a variation of them, could become the first steps toward effective question generation models for NLP stacks and DAs.

Five Tangible Benefits of Question Generation in NLP Stacks

From an AI and NLP standpoint, question generation is an incredibly difficult task. As a result, most NLP stacks have delayed implementing question generation capabilities despite their well-known benefits. Let’s look at some of the immediate advantages of leveraging effective questioning in conversational interfaces:

1 — Gathering Contextual Information

Question generation is an effective technique for gathering contextual information in a conversation in order to determine which actions should be taken. Think about a sales executive requesting a list of accounts from a DA. Upon receiving the request, the DA can ask additional questions about the nature of the accounts in order to execute the correct query against the CRM system.
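The CRM scenario above can be sketched as a simple slot-filling loop: before running the query, the assistant asks a clarifying question for each parameter the request did not specify. The slot names and questions below are hypothetical illustrations, not part of any real product.

```python
# Hypothetical slot-filling sketch: the assistant asks for any query
# parameter the user's request left unspecified.

REQUIRED_SLOTS = {
    "account_type": "Which accounts do you mean: active, prospect, or churned?",
    "region": "Which region should I filter by?",
}

def next_question(filled_slots):
    """Return the clarifying question for the first missing slot, or None."""
    for slot, question in REQUIRED_SLOTS.items():
        if slot not in filled_slots:
            return question
    return None

# Simulated dialogue: the user asked "show me my accounts" with no filters.
slots = {}
print(next_question(slots))            # asks about account type first
slots["account_type"] = "active"
print(next_question(slots))            # then asks about region
slots["region"] = "EMEA"
print(next_question(slots))            # None: ready to query the CRM
```

Real systems would generate these questions dynamically rather than from a fixed table, but the control flow, namely ask until the context is complete, is the same.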

2 — Proactive Conversational Interactions

Most conversational interactions between users and DAs are reactive in nature: users ask questions and DAs provide answers. Effective question generation algorithms can open the door to a new type of interaction in which a DA starts the conversation by asking the user a question.

3 — Questions Lead to Better Answers

From a cognitive standpoint, the ability to generate questions will allow NLP and DA stacks to provide more accurate answers. This is simply based on the fact that, linguistically, many different questions lead to the same answer. Understanding a diverse pool of questions related to a specific fact will help DAs and NLP solutions provide more effective answers.

4 — Uncertainty-to-Certainty Transitions

Questions can enable DAs to operate more efficiently in contexts of uncertainty. By asking the right questions, a DA can help transition a conversation from an uncertain context to a certain one.
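One way to make this transition measurable is information-theoretic: a good clarifying question is one whose answer reduces the entropy of the assistant's belief over possible user intents. This framing and the numbers below are my own toy illustration, not taken from the paper.

```python
import math

# Toy illustration: a clarifying question is valuable when the answer
# reduces the entropy of the assistant's belief over user intents.

def entropy(dist):
    """Shannon entropy (in bits) of a probability distribution over intents."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Before asking: the assistant is unsure which report the user wants.
before = {"sales_report": 0.4, "expense_report": 0.35, "hr_report": 0.25}

# After asking a clarifying question and hearing the user's answer,
# the belief concentrates on one intent (hypothetical numbers).
after = {"sales_report": 0.9, "expense_report": 0.08, "hr_report": 0.02}

print(entropy(before))  # high uncertainty
print(entropy(after))   # much lower: the conversation moved toward certainty
```

Under this view, question selection becomes a matter of picking the question with the largest expected entropy reduction.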

5 — Classification & Self-Training

Question generation is an effective cognitive mechanism for clarifying a specific intention. Similarly, DAs and NLP stacks can use questions to clarify users’ intended actions and gather information that can be used as input to self-training NLP models.
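The self-training loop mentioned above can be sketched very simply: each time the assistant asks a clarifying question and the user resolves the ambiguity, the pair of the original utterance and the resolved intent becomes a labeled training example. The function and field names below are hypothetical.

```python
# Hypothetical sketch: turning clarification exchanges into training data.
# Each resolved ambiguity yields a labeled example for retraining the
# intent classifier.

training_data = []

def record_clarification(utterance, clarifying_answer, resolved_intent):
    """Store the resolved example so the intent model can be retrained on it."""
    training_data.append({
        "text": utterance,               # the user's original ambiguous request
        "intent": resolved_intent,       # the intent confirmed via questioning
        "evidence": clarifying_answer,   # the user's reply to our question
    })

record_clarification(
    utterance="pull up the numbers",
    clarifying_answer="the quarterly sales ones",
    resolved_intent="show_sales_report",
)
print(len(training_data), training_data[0]["intent"])
```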
