Building a Google Assistant App using Dialogflow

Chethan N
Published in Myntra Engineering
4 min read · Mar 27, 2018

Recently we built the Myntra app for Actions on Google (Google Assistant). It is a simple, intuitive app with minimal options for browsing current trends and creating looks. The idea was to quickly experiment with what is possible and build an app for a simple use case.

With Google investing heavily in conversational interfaces like Assistant and Google Home, I expect this space to take off in a big way in the coming years. Perhaps it will change the way we interact with our phones and other smart devices. Speech recognition capabilities are improving at a rapid pace with advances in ML, and more languages are being added, making Assistant more useful and localised.

Dialogflow is the default choice for building an Actions on Google app. It is free, has a user-friendly interface, good documentation and one-click integration with Google Assistant. The tool is flexible and uses machine learning to match user queries. An app built for Google Assistant can easily be reused with other conversational platforms. Can’t ask for more!

To build an app for your use case on Dialogflow, you need to follow these simple steps.

Define intents. For each intent you can specify different templates (think of a template as a sentence with some annotations, meaning certain words are of special interest to us). It is recommended to provide multiple templates for an intent. The intent gets invoked when any of its templates matches the user input.

Defining invocation phrases for an intent. Coloured text marks resolved entities for a “show product” intent.
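To make this concrete, here is a rough sketch of what the templates for a hypothetical “show product” intent could look like, modelled as plain TypeScript data. The @product-type and @gender entity names and the phrases themselves are made up for illustration; in practice you would enter these through the Dialogflow console rather than in code.

```typescript
// Sketch of templates (training phrases) for a hypothetical "show product"
// intent. Each phrase is a list of parts; annotated parts reference an
// entity and the parameter (alias) its matched value is bound to.

interface PhrasePart {
  text: string;          // literal text the user might say
  entityType?: string;   // entity annotation, e.g. "@gender"
  alias?: string;        // parameter name the matched value is bound to
}

type TrainingPhrase = PhrasePart[];

const showProductPhrases: TrainingPhrase[] = [
  [
    { text: 'show me ' },
    { text: 'jackets', entityType: '@product-type', alias: 'productType' },
    { text: ' for ' },
    { text: 'men', entityType: '@gender', alias: 'gender' },
  ],
  [
    { text: 'I want to see ' },
    { text: 'women', entityType: '@gender', alias: 'gender' },
    { text: ' ' },
    { text: 'shoes', entityType: '@product-type', alias: 'productType' },
  ],
];
```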

An intent can have input and output contexts, each with a lifespan, and an intent will only be resolved if all of its specified input contexts are active. You can read more about contexts here. Output contexts are carried forward as inputs to subsequent intents until they expire (their lifespan counts down to zero over the following turns) or we explicitly set the lifespan to zero.
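As a small illustration, a webhook can also set an output context in its response. This is a minimal sketch assuming Dialogflow’s v2 webhook JSON format; the project and session IDs are placeholders and “awaiting-size” is a made-up context name.

```typescript
// Setting an output context from a webhook response (Dialogflow v2 format).
const webhookResponse = {
  fulfillmentText: 'Which size would you like?',
  outputContexts: [
    {
      // Full resource name: projects/<project>/agent/sessions/<session>/contexts/<name>
      name: 'projects/my-project/agent/sessions/my-session/contexts/awaiting-size',
      lifespanCount: 3,                      // stays active for the next 3 turns
      parameters: { productType: 'jackets' } // carried forward to later intents
    }
  ]
};
```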

Each intent has a unique action (the action name is user defined) and can have many parameters. The action and parameters are used by the fulfilment service (explained below) to return an appropriate response.

The action (“show.item”) and parameters for the intent explained above.
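For reference, these are roughly the fields a fulfilment service reads from a Dialogflow (v2) webhook request; the sample payload assumes the intent above resolved to the action “show.item” with hypothetical gender and product-type parameters.

```typescript
// Shape of the parts of a Dialogflow v2 webhook request a fulfilment
// service typically cares about.
interface ShowItemWebhookRequest {
  queryResult: {
    queryText: string;                    // what the user actually said
    action: string;                       // e.g. "show.item"
    parameters: Record<string, unknown>;  // e.g. { gender: "male", productType: "jackets" }
    intent: { displayName: string };
  };
  session: string;
}

// Example payload the fulfilment service might receive for the intent above.
const example: ShowItemWebhookRequest = {
  queryResult: {
    queryText: 'show me jackets for men',
    action: 'show.item',
    parameters: { gender: 'male', productType: 'jackets' },
    intent: { displayName: 'Show Product' },
  },
  session: 'projects/my-project/agent/sessions/my-session',
};
```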

Define entities. Entities are the domain-specific values that matter for your use case. You need to define all the possible values an entity can take, and these entities are then used to annotate the templates in your intents. An example of an entity is gender, with possible values male and female; you can define synonyms for each value, such as men and man for male. Dialogflow also has many predefined entities, such as color and number.

A gender entity defined on Dialogflow
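The same entity, written out as a small data sketch roughly mirroring how Dialogflow stores entity entries (a reference value plus its synonyms). Normally you would configure this in the console rather than in code.

```typescript
// Sketch of a gender entity with reference values and synonyms.
const genderEntity = {
  displayName: 'gender',
  entities: [
    { value: 'male',   synonyms: ['male', 'men', 'man', 'boys'] },
    { value: 'female', synonyms: ['female', 'women', 'woman', 'girls'] },
  ],
};
```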

The next step is to build a fulfilment service that can respond to user queries. You can build and manage the fulfilment service directly in Dialogflow via Cloud Functions for Firebase. Alternatively, you can build a webhook endpoint that receives POST requests from Dialogflow. Dialogflow matches user queries against the defined intents, and the resolved intent and parameters are used to send an appropriate request to the webhook endpoint. An appropriate response can then be generated based on the intent type and the matched entity values. You can host your webhook service either as a Google Cloud project or on your own infrastructure (which gives you more flexibility).
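A minimal sketch of such a self-hosted webhook, assuming Dialogflow’s v2 webhook request/response JSON. The route path and the reply logic are placeholders for whatever internal services actually generate the response.

```typescript
// Minimal self-hosted Dialogflow webhook using Express (v2 webhook format).
import express from 'express';

const app = express();
app.use(express.json());

app.post('/dialogflow/webhook', (req, res) => {
  const action = req.body.queryResult?.action;
  const params = req.body.queryResult?.parameters ?? {};

  if (action === 'show.item') {
    // e.g. params = { gender: 'male', productType: 'jackets' }
    const reply = `Here are trending ${params.productType} for ${params.gender}.`;
    res.json({ fulfillmentText: reply });
  } else {
    res.json({ fulfillmentText: "Sorry, I didn't get that." });
  }
});

app.listen(8080, () => console.log('Webhook listening on port 8080'));
```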

We built our own webhook endpoint to serve responses, as it gives us more flexibility. We use our internal services to generate a rich response and send it back to Dialogflow. Dialogflow forwards this response to Google Assistant, and it is shown to the user.

The Actions on Google npm package (actions-on-google) can be used to build the webhook endpoint that responds to Dialogflow.
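If you go with the client library instead of handling the raw JSON yourself, a rough sketch could look like this. It assumes v2 of the actions-on-google library, where handlers are keyed by the Dialogflow intent name; “Show Product” and its parameters are hypothetical.

```typescript
// Webhook built with the actions-on-google client library (v2 style).
import { dialogflow } from 'actions-on-google';
import express from 'express';

const app = dialogflow();

// One handler per Dialogflow intent; names and parameters are illustrative.
app.intent('Show Product', (conv, params) => {
  conv.ask(`Here are some ${params['productType']} picks for ${params['gender']}.`);
});

// The Dialogflow app object also works as an Express request handler.
express().use(express.json(), app).listen(8080);
```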

That’s it. You can now open Integrations in Dialogflow and choose Google Assistant. All conversations happening on Google Assistant will be received by Dialogflow and resolved to the appropriate intent (or a fallback intent if none match), and Dialogflow will call the fulfilment endpoint with the resolved action name and the parameters captured from the conversation. An appropriate response can be constructed and sent back to Dialogflow, which in turn returns it to Google Assistant. The response can be simple text or a rich response like a card or a carousel.
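For example, a handler that returns a basic card alongside the required simple text response might look roughly like this, again assuming v2 of the actions-on-google library; the product details and image URL are placeholders.

```typescript
// Returning a rich response (basic card plus suggestion chips).
import { dialogflow, BasicCard, Image, Suggestions } from 'actions-on-google';

const app = dialogflow();

app.intent('Show Product', (conv) => {
  conv.ask('Here is a look you might like:'); // simple text part comes first
  conv.ask(new BasicCard({
    title: 'Denim jacket',
    subtitle: 'Trending this week',
    image: new Image({
      url: 'https://example.com/denim-jacket.png',
      alt: 'Denim jacket',
    }),
  }));
  conv.ask(new Suggestions(['Show more', 'Create a look']));
});
```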
