Build Your First Action with Interactive Canvas

Yüksel Tolun
Google Developer Experts
9 min read · Jul 1, 2019

If you’ve ever developed an Action for the Google Assistant, you might have thought, “This platform has serious potential for gaming,” only to realize that you’re quite limited in what you can actually show to users.

Enter Interactive Canvas!

What is the Interactive Canvas?

Interactive Canvas, which was announced at Google I/O ’19, is a framework that allows your Actions to serve rich visuals that users can interact with by voice. Now, I personally don’t think that games must have graphics; I believe that voice-only games can be just as fun as graphics-heavy ones. But Interactive Canvas is still a game-changer, as it allows you to build voice-forward experiences using the web technologies you already know and love. It currently supports smart displays and Android mobile devices.

If you want to make games for the Google Assistant, keep reading. Because in this post, I’ll walk you through the steps of building a simple Action that uses Interactive Canvas.

Limitations

Even though Interactive Canvas supports most of the HTML/CSS/JS features that a modern web browser does, there are some limitations for privacy and security reasons. You can’t use cookies or local storage, for example. You can find the full list of restrictions here.

What we’re going to be building

We will build an Action that shows the user various animal icons on the screen and lets them change the color of each animal simply by saying the name of the animal and the color they want it to turn.

How will it work?

  • The user will start the Action, and our Default Welcome Intent will trigger the Interactive Canvas, showing the user a web page with the animal icons.
  • The user will pick an animal and a color and say, for example, “make the dog red”.
  • As always, the Assistant will turn the user’s utterance into text and send it to Dialogflow.
  • Dialogflow will match the user’s request to the change_animal_color intent and extract the entity values as animal = dog and color = red.
  • Our fulfillment will update the data (state) of our canvas. The web page will register this update via interactive_canvas.min.js, which will trigger a function that changes the color of the animal.

Prerequisites

Please note that even though we won’t make a full-fledged game at the end of this article, you still need to be familiar with the basics of the Actions on Google platform and Firebase.

  • Understanding the basics of the Actions on Google platform
  • Familiarity with Dialogflow
  • Node.js and firebase-tools installed on your computer

If you’re lacking one or more of the prerequisites, I suggest completing the Build Actions for the Google Assistant (Level 1) and Build Actions for the Google Assistant (Level 2) codelabs before following this tutorial.

Building Hello World

Creating the project

  1. First of all, go to the Actions on Google Console and create a new project. (Don’t forget to notice how beautiful the new UI is 😍)
  2. Pick the Games & Fun category, since Interactive Canvas is only available for games as of today.
  3. Pick Conversational at the next step, as we’re going to be using Dialogflow.
  4. Under the Build Your Action section, choose Add Action(s) and click the Add your first action button. Select Custom intent and click Build.
  5. At this point, Dialogflow should open. Log in if you haven’t already and click Create at the top right corner.
  6. Go back to the Actions on Google Console and select Deploy from the top menu.
  7. Scroll to the very bottom and check the box labeled Interactive Canvas. Scroll back up to the top and click Save.

Voilà! Your project and your Dialogflow agent are ready for development.

If you didn’t pick the right category at this step, you can change it anytime via the dropdown under Deploy > Directory Information > Additional Information.

Setting up the local development environment

In this tutorial, we’ll be using Firebase Cloud Functions for webhook fulfillment and Firebase Hosting to host our web app. If you’ve never used firebase-tools on your computer, first run:

firebase login

Once you’re logged in, create a directory on your computer, cd into it, and run the following command:

firebase init

Say yes when it asks if you’re ready, select the Hosting and Functions features from the list, and hit Enter.

A new list will pop up; select the project ID of the project you created in the previous steps. Answer the setup questions as follows:

Functions configuration:

  • JavaScript
  • No ESLint
  • Yes, install the dependencies.

Hosting configuration:

  • Use public as the public directory (just hit Enter).
  • Yes, configure this app as a single-page app.

There you have it: a local development environment for our Cloud Functions and Hosting, ready to deploy to Firebase.
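For reference, those answers end up in a firebase.json file at the root of the project. With the choices above, its hosting section should look roughly like this (saying yes to the single-page-app question is what adds the rewrite rule):

{
  "hosting": {
    "public": "public",
    "ignore": [
      "firebase.json",
      "**/.*",
      "**/node_modules/**"
    ],
    "rewrites": [
      {
        "source": "**",
        "destination": "/index.html"
      }
    ]
  }
}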

Configuring the Welcome Intent

Our Default Welcome Intent needs to open our web app in the Assistant. That means we have to configure it so that it’s fulfilled by our webhook. Open Dialogflow console and:

  1. Pick Default Welcome Intent from the list and delete all the training phrases for the intent. (We don’t want our users to come back to this intent by using those phrases)
  2. Go ahead and delete all the responses, too.
  3. Lastly, enable the webhook call for this intent using the toggle at the bottom of the page. Click Save.

Building the basics: Web App

We need to build and deploy the web app first because we’re going to use its URL while building our fulfillment.

Open the public > index.html file and replace its content with the following code:
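Something along these lines is enough for a first version; the markup itself doesn’t matter much, as long as the page loads interactive_canvas.min.js from Google’s CDN:

<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>Interactive Canvas Hello World</title>
    <!-- The Interactive Canvas web library provided by Google -->
    <script src="https://www.gstatic.com/assistant/interactivecanvas/api/interactive_canvas.min.js"></script>
  </head>
  <body>
    <h1>Hello, World!</h1>
  </body>
</html>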

Notice the asset that Google provides us with: interactive_canvas.min.js
  • interactive_canvas.min.js is the tool that’s going to register the state updates that we’ll send from our webhook. Basically, this is the file that makes it possible to communicate between our conversational agent and our web app.
  • interactive_canvas.min.js also has a method to get the header height of the user’s device.

Now, run the following command to deploy our hosting:

firebase deploy --only hosting

After it’s done uploading, you should see the Hosting URL in the terminal. You might want to copy that URL as we’re going to use it in the next step.

Building the basics: Fulfillment

We have already generated the necessary files for our project with firebase-tools, so we can now go ahead and edit them.

First, we need to set the dependencies for our webhook fulfillment. Open functions > package.json and replace its content with the following code:
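A minimal package.json along these lines will do; the exact versions aren’t critical, but make sure actions-on-google points at the preview tag:

{
  "name": "interactive-canvas-fulfillment",
  "description": "Webhook fulfillment for the Interactive Canvas sample",
  "engines": {
    "node": "8"
  },
  "dependencies": {
    "actions-on-google": "preview",
    "firebase-admin": "^7.0.0",
    "firebase-functions": "^2.3.0"
  }
}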

As you can see, we’re depending on the “preview” version of the actions-on-google client library to use Interactive Canvas.

Then run the following command in the functions directory to install the dependencies:

npm install

When it’s done installing, open functions > index.js and replace its content with the following code:
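Here’s a minimal sketch of the fulfillment. The Default Welcome Intent handler speaks a short greeting and attaches an HtmlResponse that points at our hosted page:

'use strict';

const functions = require('firebase-functions');
const {dialogflow, HtmlResponse} = require('actions-on-google');

const app = dialogflow({debug: true});

app.intent('Default Welcome Intent', (conv) => {
  // What the Assistant says out loud...
  conv.ask('Hello world');
  // ...and the web app it renders on the Canvas.
  conv.ask(new HtmlResponse({
    url: 'YOUR_URL_HERE',
  }));
});

exports.dialogflowFirebaseFulfillment = functions.https.onRequest(app);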

⚠️ Don’t forget to replace YOUR_URL_HERE with your web app’s URL ⚠️

You need to return a new HtmlResponse object to use Interactive Canvas. Write your web app’s URL, which you copied in the last step, where it says YOUR_URL_HERE.

After you’ve updated the URL, run the following command to deploy your cloud functions:

firebase deploy --only functions

Now, in your terminal, you should see the Function URL. Copy it and go to the Dialogflow Console. On the left sidebar, click Fulfillment, enable Webhook, and paste your function URL.

Test your Action

Go to the Actions on Google Console and open the Simulator by clicking Test in the top menu. When you say “Talk to my test app”, you should see Hello, World! on the display and also hear “hello world”.

How exciting 👀

It’s cool to see our custom web page inside the Assistant, but we still can’t interact with it using our voice. Let’s move on to the next section to see how we can.

Taking it a step further

Let’s put the animal icons on the screen and let users change their colors using their voice.

Expanding the Dialogflow Agent

In the finished project, we’ll have four animals changing colors. This means we need to handle two entities in Dialogflow: animal and color. Lucky for us, Dialogflow has a system entity for color, so we don’t have to define it ourselves. But we do need to create a new entity for the animals we’ll show to the user. Open the Dialogflow Console and click Entities on the left sidebar. Create a new entity named animal as follows:
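If you want to follow along exactly, the entity could look like this, with the reference value on the left and its synonyms on the right (these four animals match the icons we’ll add later):

  • dog: dog
  • bird: bird, crow
  • frog: frog
  • dragon: dragon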

I added crow as a synonym for the bird because the icon we’ll be using is technically a crow 🐦

Once you’re done, save it.

We also need a new intent to change an animal’s color. Let’s go ahead and create it. Select Intents from the left sidebar and create a new intent named change_animal_color:

Make sure the entities are extracted properly 🔍

You can copy the following training phrases:

  • turn frog into blue
  • make bird yellow
  • frog green
  • change the color of the dog into red
  • paint the dragon into pink

Scroll down to the bottom of the page, open the Fulfillment section, and turn on Enable webhook call for this intent.

When you’re done, go ahead and save it.

Create fulfillment for the new intent

Open functions > index.js and replace its content with the following code:
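The updated fulfillment might look like this (assuming the Dialogflow parameters are named animal and color, matching the entities we just set up). Note that the new handler’s HtmlResponse carries only data, not a url:

'use strict';

const functions = require('firebase-functions');
const {dialogflow, HtmlResponse} = require('actions-on-google');

const app = dialogflow({debug: true});

app.intent('Default Welcome Intent', (conv) => {
  conv.ask('Welcome! Try saying something like "make the dog red".');
  conv.ask(new HtmlResponse({
    url: 'YOUR_URL_HERE',
  }));
});

app.intent('change_animal_color', (conv, {animal, color}) => {
  conv.ask(`Okay, painting the ${animal} ${color}. What's next?`);
  // No url this time: the web app is already loaded,
  // we only push a new state for it to react to.
  conv.ask(new HtmlResponse({
    data: {animal, color},
  }));
});

exports.dialogflowFirebaseFulfillment = functions.https.onRequest(app);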

⚠️ Don’t forget to replace YOUR_URL_HERE with your web app’s URL ⚠️

As you can see, the new intent handler returns another HtmlResponse object. But this time, we don’t need to set the URL again. Instead, we pass a data object with our parameters to update the state of our web app.

Expanding the web app

Finally, we’re going to make our web app display the animals and change their colors when the change_animal_color intent updates the data (state). To do that, open the public > index.html file and replace its content with the following code:
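Here’s a simplified sketch of the page. The real version uses the Font Awesome SVG icons credited below; for the sketch, simple placeholder shapes are fine, as long as each animal’s SVG element id matches its entity value, because that’s what main.js will look up:

<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>Color the Animals</title>
    <script src="https://www.gstatic.com/assistant/interactivecanvas/api/interactive_canvas.min.js"></script>
    <link rel="stylesheet" href="main.css">
  </head>
  <body>
    <div class="animals">
      <!-- One SVG per animal; the ids match the values of the animal entity. -->
      <svg id="dog" viewBox="0 0 100 100"><circle cx="50" cy="50" r="40"/></svg>
      <svg id="bird" viewBox="0 0 100 100"><circle cx="50" cy="50" r="40"/></svg>
      <svg id="frog" viewBox="0 0 100 100"><circle cx="50" cy="50" r="40"/></svg>
      <svg id="dragon" viewBox="0 0 100 100"><circle cx="50" cy="50" r="40"/></svg>
    </div>
    <script src="main.js"></script>
  </body>
</html>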

Free SVG icons are courtesy of FontAwesome licensed under CC BY 4.0 License — More info at http://fontawesome.com/

Did you notice that we included two new files, main.css and main.js? Let’s create them:

public > main.css
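A few lines are enough for the sketch: lay the icons out in a row and give them a default fill (main.js will overwrite the fill per animal later):

body {
  margin: 0;
  font-family: sans-serif;
}

.animals {
  display: flex;
  justify-content: space-around;
  align-items: center;
  padding: 16px;
}

.animals svg {
  width: 20vw;
  height: 20vw;
  fill: #555; /* default color until the user asks for a new one */
}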

Some basic styling to make everything look alright

public > main.js
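A sketch of the logic, assuming the data payload arrives exactly as we sent it from the webhook ({animal, color}):

'use strict';

window.addEventListener('load', () => {
  // Push the page content below the Assistant header on smart displays.
  interactiveCanvas.getHeaderHeightPx().then((height) => {
    document.body.style.paddingTop = `${height}px`;
  });

  const callbacks = {
    // Called every time our webhook sends an HtmlResponse with a data payload.
    onUpdate(data) {
      if (data && data.animal && data.color) {
        const element = document.getElementById(data.animal);
        if (element) {
          // fill is inherited, so setting it on the <svg> recolors the shape inside.
          element.style.fill = data.color;
        }
      }
    },
  };

  // Register the callbacks so the Canvas starts forwarding state updates.
  interactiveCanvas.ready(callbacks);
});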

Notice how we use the getHeaderHeightPx() method of the interactiveCanvas object

As you can see, the way our state updates are registered by the web app is actually quite simple.

Every time an HtmlResponse is sent from our webhook, the onUpdate function will be called with the new data (state). When that happens, we select the corresponding animal’s SVG element and change its fill color to the color stored in the updated state.

Now that we’re done, deploy both your fulfillment and your hosting:

firebase deploy

Test your Action

Go ahead, open the Simulator, and say “Talk to my test app”. Once you see the icons on the screen, feel free to try changing the animal colors:

You can also use smart display devices or Android phones to test your Action.

Well done, you’ve created your first Action with the Interactive Canvas 👏

Learn more

💻 You can download the sample code for this tutorial on GitHub.

If you’d like to learn more about Interactive Canvas, you can visit the official documentation or you can check out Build Actions with Interactive Canvas for the Google Assistant Codelab which was written by Yoichiro Tanaka who is also a Google Assistant GDE.

Closing remarks

Obviously, building a game involves much more than reacting to user input with auditory and visual responses. There are still many aspects of voice gaming that we’ll need to figure out, but it’s nice to finally have a way to show our users some beautiful graphics if we want to.

The best part of Interactive Canvas is that it takes advantage of the web technologies many of us already love and use. This also means that what you can show on the display basically has no limits. That‘s why ever since the Interactive Canvas was announced, I’ve been experimenting with JS libraries that allow me to create animations or visual effects on the web.

Which JS libraries or tools do you think would pair well with Interactive Canvas?

Big thanks to Allen Firstenberg and Mehmet Karaköse for their feedback on this article. Don’t forget to 👏 if you liked it. If you’re as excited as I am about the future of voice gaming, you should follow me here on Medium and also on Twitter.
