Building Apps for Google Assistant

Mayank Mahajan
Xebia Engineering Blog
6 min read · Jan 9, 2020

We all know how voice assistants influence our lives in countless ways. They have become part of our daily routine, from 6 AM alarms to the peaceful music that lulls us to sleep. We have all heard of voice assistants like Amazon Alexa, Apple Siri, and Google Assistant. They really do make our work easier, don’t they?

How do voice assistants work?
Everything a voice assistant does is the result of an action that is invoked when you say a particular phrase. For example, when you say “OK Google”, your assistant knows you are talking to it and is ready to listen to your commands.

Everything you need to build apps for your Assistant
1. Actions on Google ( https://console.actions.google.com )
- Extends the capabilities of Google Assistant
- Connects you with new users wherever Google Assistant is available
- Helps users get work done with text or voice commands
- Lets you innovate with a conversational interface

2. Dialogflow ( https://dialogflow.cloud.google.com )
- Formerly known as api.ai
- Uses artificial intelligence and machine learning to understand what the user is trying to say
- Lets you send a response to the user’s request directly from Dialogflow

How Google Assistant works

Conversation Flow

The diagram above shows how a user’s interaction with Google Assistant takes place.

  • The user requests fulfillment by invoking an action
  • Once the action is invoked, Dialogflow processes the request to generate the relevant output
  • The generated output is sent back to the user as a response in the conversation

Understanding Entities and Intents

Entities:

  • Objects that enumerate the different things people talk about in natural conversation
  • Entities are the values we try to capture from the user’s phrases
  • This could be something like “Knowledge” or “Training”
  • We use entities as slot-filling parameters to extract business values (see the sketch after this list)
  • Types of entities:
    * Prebuilt system entities: time, number, address
    * Developer entities: custom entities
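
As a rough illustration of slot filling, the sketch below shows how a captured entity value could be read as a parameter inside a fulfillment handler. The parameter name topic and the responses are hypothetical, and agent is the WebhookClient instance from the dialogflow-fulfillment library used later in this post.

// Sketch: reading a slot-filled entity value in a handler (parameter name "topic" is hypothetical)
function topicHandler(agent) {
  const topic = agent.parameters.topic; // e.g. "Knowledge" or "Training"
  if (topic) {
    agent.add(`Here is what I can tell you about ${topic}.`);
  } else {
    agent.add('Which topic are you interested in?'); // re-prompt to fill the slot
  }
}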

Intent:

  • An intent is triggered by a set of phrases the user says
  • This could be something like “Please tell me something about xke”
  • It is the entry point into the application
  • It maps what the user says to what the conversational experience says or does (see the sketch after this list)
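
To make that mapping concrete, here is a minimal sketch of how a matched intent is dispatched to a handler by its display name using the dialogflow-fulfillment library. The intent name and response text are illustrative, and agent is the WebhookClient instance shown in the inline editor code later in this post.

// Sketch: dispatch a matched intent to its handler by display name (names are illustrative)
function aboutXkeHandler(agent) {
  agent.add('Here is something about xke.'); // the conversational experience's reply
}

const intentMap = new Map();
intentMap.set('About XKE', aboutXkeHandler); // key must match the intent's display name
agent.handleRequest(intentMap);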

Let’s start building some fun stuff

  1. Go to the Actions console
  2. Click on New Project and give your app a name
  3. On the next screen, select a suitable category for your app
  4. Go to the Invocation tab on the left-hand panel and define how you would like your app to be invoked.
Invoking your application

5. Under the Invocation tab, click on the Actions tab to define your first action. This takes you to the Dialogflow window, where you define the action and its response. By default, the Welcome intent is created for you; you just need to define the response to its invocation.

Welcome Intent

In the screenshot above, when the user tries to connect to your app with a phrase like “Talk to Xebia Blog”, your app replies with one of the responses you have defined. For example, when a user connects to my app, my Assistant responds with “Welcome to Xebia Blog! How can I help you?”

Congratulations! You just created your first intent and its response. You can now create multiple other intents, or even add follow-up intents to drive the flow of the conversation.

Please refer to the image below as an example. It shows how you could book an appointment using just a few simple actions.

Follow-up Intent

For the example above, we have created two entities:

  • Color
  • Length

and four intents (a fulfillment sketch for the Appointment intent follows this list):

  • Welcome
  • Order
  • Product
  • Appointment
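
A possible fulfillment sketch for the Appointment intent, assuming the Color and Length entities are captured as parameters named color and length (the parameter names and replies are assumptions, not the exact configuration from the screenshots):

// Sketch: handler for the Appointment intent using the Color and Length entities
// (parameter names "color" and "length" are assumed)
function appointmentHandler(agent) {
  const color = agent.parameters.color;
  const length = agent.parameters.length;
  if (color && length) {
    agent.add(`Booking an appointment for a ${length}, ${color} style.`);
  } else {
    agent.add('What color and length would you like?'); // ask for the missing slots
  }
}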

Configure fulfillment

Webhooks

Your Assistant app supports webhooks. When a user query matches an intent that has webhook fulfillment enabled, Dialogflow sends your web service a POST request, and your service returns the response.

You can also hook your Assistant app into other services, such as GitHub, to invoke an action via third-party apps.
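
As a minimal sketch of what such a web service could look like (assuming Node.js with Express; the route, port, and reply text are placeholders), the webhook reads the matched intent from Dialogflow’s POST body and returns a fulfillmentText reply:

// Minimal Dialogflow webhook sketch using Express (route and port are placeholders)
const express = require('express');
const app = express();
app.use(express.json()); // Dialogflow sends the request body as JSON

app.post('/webhook', (request, response) => {
  const intentName = request.body.queryResult.intent.displayName; // matched intent
  const parameters = request.body.queryResult.parameters;         // slot-filled entity values
  // Dialogflow reads the reply from the `fulfillmentText` field
  response.json({ fulfillmentText: `You reached the ${intentName} intent.` });
});

app.listen(3000);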

Inline Editors

The power of Google Assistant is not limited to a few specific tasks. Developers can extend it by writing some code directly in Dialogflow’s inline editor.

// See https://github.com/dialogflow/dialogflow-fulfillment-nodejs
// for Dialogflow fulfillment library docs, samples, and to report issues
'use strict';

const functions = require('firebase-functions');
const {WebhookClient} = require('dialogflow-fulfillment');
const {Card, Suggestion} = require('dialogflow-fulfillment');

process.env.DEBUG = 'dialogflow:debug'; // enables lib debugging statements

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
  const agent = new WebhookClient({ request, response });
  console.log('Dialogflow Request headers: ' + JSON.stringify(request.headers));
  console.log('Dialogflow Request body: ' + JSON.stringify(request.body));

  function welcome(agent) {
    agent.add(`Welcome to my agent!`);
  }

  function fallback(agent) {
    agent.add(`I didn't understand`);
    agent.add(`I'm sorry, can you try again?`);
  }

  // Uncomment and edit to make your own intent handler
  // uncomment `intentMap.set('your intent name here', yourFunctionHandler);`
  // below to get this function to be run when a Dialogflow intent is matched
  function yourFunctionHandler(agent) {
    agent.add(`This message is from Dialogflow's Cloud Functions for Firebase editor!`);
    agent.add(new Card({
      title: `Xebia Store`,
      imageUrl: 'https://dataxday.fr/wp-content/uploads/2018/01/xebia-logo-data.jpg',
      text: `Welcome to Xebia`,
      buttonText: 'This is a button',
      buttonUrl: 'https://assistant.google.com/'
    }));
    agent.add(new Suggestion(`Quick Reply`));
    agent.add(new Suggestion(`Suggestion`));
    agent.setContext({ name: 'weather', lifespan: 2, parameters: { city: 'Rome' }});
  }

  // Uncomment and edit to make your own Google Assistant intent handler
  // uncomment `intentMap.set('your intent name here', googleAssistantHandler);`
  // below to get this function to be run when a Dialogflow intent is matched
  function googleAssistantHandler(agent) {
    let conv = agent.conv(); // Get Actions on Google library conv instance
    conv.ask('Hello from the Actions on Google client library!'); // Use Actions on Google library
    agent.add(conv); // Add Actions on Google library responses to your agent's response
  }
  // See https://github.com/dialogflow/dialogflow-fulfillment-nodejs/tree/master/samples/actions-on-google
  // for a complete Dialogflow fulfillment library Actions on Google client library v2 integration sample

  // Run the proper function handler based on the matched Dialogflow intent name
  let intentMap = new Map();
  intentMap.set('Default Welcome Intent', welcome);
  intentMap.set('Default Fallback Intent', fallback);
  // intentMap.set('your intent name here', yourFunctionHandler);
  // intentMap.set('your intent name here', googleAssistantHandler);
  agent.handleRequest(intentMap);
});

This tutorial should give you some basic insight into how you can leverage the power of Google Assistant and create simple yet powerful apps.
