Building a Voice Controlled Home Automation Model — Part 4 — Wit.ai

Timi Ajiboye
Published in chunks of code
4 min read · Nov 28, 2016

This is the fourth part in the Building a Voice Controlled Home Automation series. You can find the first part here.

At the end of the third part, I explained the code that will be loaded onto the Arduino. Today, we’ll talk about Natural Language Processing.

Remember this high-level overview from the first part? We’re going to talk about “4”: Wit.ai NLP.

High Level Overview

Wit.ai is an awesome FREE service that does all the NLP, speech-to-text heavy lifting. I have another series on this blog that details, as much as possible, Wit.ai’s (new) features.

Intents and Entities

Before you begin to use Wit’s API in your code, you have to tell it what to look out for: the kinds of intents your user can have and the entities within the user’s requests. See this excerpt from the first Wit.ai post.

Intents: These are pretty self-explanatory. An intent is simply what the user intends to do. This could be something like changeTemperature or getNews.

Entities: Entities are variables that contain the details of the user’s task. With an intent like changeTemperature, you’d want to extract, from the user’s text/voice input, the exact temperature they want to set. Wit comes with built-in entity types like location, number, and amount_of_money, and you can also create your own.
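To make that concrete, here’s a minimal sketch of pulling an intent and an entity out of a Wit response. The JSON below is illustrative, modeled on the older Wit response format (where the intent shows up as just another entity, matching how it’s created in the dashboard later in this post); the exact field names may differ in your API version.

```python
import json

# Illustrative Wit-style response for "set the temperature to 72 degrees".
# The shape is an assumption for demonstration, not Wit's exact schema.
sample_response = """
{
  "_text": "set the temperature to 72 degrees",
  "entities": {
    "intent": [{"value": "changeTemperature", "confidence": 0.98}],
    "number": [{"value": 72, "confidence": 0.95}]
  }
}
"""

def extract(response_text):
    """Pull the top intent and any number entity out of a Wit-style response."""
    data = json.loads(response_text)
    entities = data.get("entities", {})

    def first_value(name):
        values = entities.get(name, [])
        return values[0]["value"] if values else None

    return first_value("intent"), first_value("number")

intent, temperature = extract(sample_response)
print(intent, temperature)  # changeTemperature 72
```

Once you can pull these values out, the rest of the app is just deciding what to do with them.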

We’re going to create intents and entities for our home automation Android application.

You can check out the complete Wit application here and use it as a reference when you’re creating yours.

In this application, we’re going to create two Intents: door and light. From their names, you can easily infer what they mean. If Wit recognizes a door intent, it means your user is trying to control/perform some action on a door.

Within the door intent, we have two entities:

Door intent
  1. door_descriptor: This is the part of your user’s speech that indicates which door they’re trying to control, e.g. front door, back door, garage door…
  2. open_close: This is the part of your user’s speech that determines what action they’re trying to take on the door, i.e. open or close.

Similarly, within the light intent, we have two entities:

Light intent
  1. light_descriptor: This is the part of your user’s speech that indicates which light they’re trying to control, e.g. living room lights, dining lights, garage lights…
  2. on_off: This is the part of your user’s speech that determines what action they’re trying to take on the light, i.e. on or off. This is actually an entity that Wit already provides, so you can just choose it (as opposed to creating your own).
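Putting the two intents together, here’s a sketch of how the parsed results could be mapped to device commands. The `handle_command` function and the command-string format (`"LIGHT:garage:on"`) are placeholders of my own, not the protocol used elsewhere in this series; they only show the branching logic.

```python
# Map a Wit intent plus its entities to a command string for the device.
# The intent/entity names (door, light, door_descriptor, open_close,
# light_descriptor, on_off) match the Wit app described above; the
# command-string format is a made-up placeholder.

def handle_command(intent, entities):
    """Turn a recognized intent and its entities into a command string."""
    if intent == "door":
        descriptor = entities.get("door_descriptor", "front")
        action = entities.get("open_close")
        if action in ("open", "close"):
            return "DOOR:{}:{}".format(descriptor, action)
    elif intent == "light":
        descriptor = entities.get("light_descriptor", "living room")
        action = entities.get("on_off")
        if action in ("on", "off"):
            return "LIGHT:{}:{}".format(descriptor, action)
    return None  # unrecognized intent or incomplete request

print(handle_command("light", {"light_descriptor": "garage", "on_off": "on"}))
# LIGHT:garage:on
```

Defaulting the descriptor (e.g. to “front” or “living room”) is one way to handle requests like “open the door” that name no specific device; you could also ask the user to clarify.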

Wit.ai Dashboard Walkthrough

Finally, I’m going to walk you through creating intents and entities on the (new) Wit dashboard.

It’s a bit different from the one I used to create the application I shared above, so you can’t exactly copy it.

1. Create a new application

Click on the + sign on the top right when you sign into Wit.

Create new Wit app

Then fill out this form and click “Create App”

New Wit app form

2. Create entities

In the new Wit dashboard, you create an “intent” the same way you create any other entity.

  • Click on the “Understanding” tab, then type a sentence you want your app to understand. For example: “turn on living room lights”.
  • Click on “add a new entity”, type “intent”, then set the value of that intent to “light”.
  • Now you can select/highlight parts of your sentence and map them to entities. For example, highlight “living room”, click on “create an entity for living room”, then type the entity name “light_descriptor”.
  • Do the same thing for the “on_off” entity.
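Once your app understands a few sentences, you can query it over HTTP: Wit exposes a GET /message endpoint that takes your sentence as the q parameter and your server access token in an Authorization: Bearer header. The helper below just builds that request with Python’s standard library; the token shown is a placeholder for your own.

```python
import urllib.parse
import urllib.request

WIT_API_URL = "https://api.wit.ai/message"

def build_wit_request(query, token):
    """Build an authenticated GET request for Wit's /message endpoint."""
    url = WIT_API_URL + "?" + urllib.parse.urlencode({"q": query})
    return urllib.request.Request(
        url,
        headers={"Authorization": "Bearer " + token},
    )

# "YOUR_SERVER_TOKEN" is a placeholder; copy the real server access token
# from your Wit app's settings page.
req = build_wit_request("turn on living room lights", "YOUR_SERVER_TOKEN")
print(req.full_url)
# https://api.wit.ai/message?q=turn+on+living+room+lights
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) returns the JSON response containing the recognized intent and entities, which is what the Android app in the next part will consume.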

What’s next

In the next (and probably final) article in this series, we’ll create an Android app that sends our voice commands to Wit for interpretation, and then to the Arduino for actuation.
