Integrating Snips with Home Assistant

With the rise of Amazon Echo and Google Home, we are getting increasingly accustomed to controlling our home via voice. But our home is also our most intimate space, and we have to think twice before installing an always-listening, cloud-connected device streaming our data to some distant server.

On February 28th, an Amazon S3 outage resulted in massive home IoT dysfunction.

Cloud-connected devices have already had disastrous consequences for parents in Germany. And in the US, where more and more devices depend on remote servers to function, we’ve seen potentially dangerous situations arise.

At Snips, we are challenging the idea that adding Artificial Intelligence to our everyday life means giving up Privacy. Some things are simply not meant to happen online, especially when they involve the things we say. This belief is why we recently launched the Snips Voice Platform, a 100% on-device alternative to Amazon Alexa, allowing anyone to easily add powerful voice assistants to connected devices without compromising their Privacy.

In this article, we will show how to control our home's lights by voice using Snips and Home Assistant, without requiring an Internet connection.

Snips integrates nicely with Home Assistant, a popular Open Source platform for home automation. If you have any questions or comments, please join our community on Slack.

The Snips Voice Platform

The Snips Voice Platform features Hotword Detection, Speech Recognition, Natural Language Understanding and Dialog Management.

Similar to Amazon Alexa, Snips takes voice or text as input and produces intents as output: explicit representations of the intention behind an utterance, which can subsequently be used by the system to perform appropriate actions. But instead of running the ASR and NLU in the cloud, Snips runs entirely on the Raspberry Pi, with blazing performance!

From utterance to intent, all happening on the Raspberry Pi!

Here is an example of Snips in action:

The Snips Voice Platform in action!

Setting up Snips and Home Assistant


We will be using the following hardware:

  • Raspberry Pi 3 ($39 on Adafruit)
  • SanDisk 16GB Micro SDHC ($10 on Amazon)
  • A standard USB microphone ($15 on Amazon)
  • Philips Hue Starter Kit ($69 on Amazon)

Home Assistant

Setting up a Raspberry Pi with Home Assistant is straightforward. For a detailed tutorial, see the Getting Started Guide.

Philips Hue

In this post, we will use voice commands to trigger Philips Hue lights. They are well integrated into Home Assistant. We simply need to add the following entry to configuration.yaml:

light:
  platform: hue

Check the Philips Hue Getting Started Guide for help obtaining the IP address of your Hue bridge.
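If you prefer not to dig through your router's admin page, Philips also exposes a public discovery endpoint (https://discovery.meethue.com/) that returns the bridges on your network as JSON. A minimal sketch of parsing such a response in Python, using a made-up sample payload:

```python
import json

# Shape of the JSON returned by the Hue discovery endpoint;
# the id and IP address below are made-up sample values.
sample_response = '[{"id": "001788fffe123456", "internalipaddress": "192.168.1.42"}]'

bridges = json.loads(sample_response)
for bridge in bridges:
    # This is the IP address to use when configuring the hue platform
    print(bridge["internalipaddress"])
```

In practice you would fetch the endpoint with curl or an HTTP library and parse the body the same way.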


Snips

The Snips Voice Platform is equally straightforward to set up, with a single command:

$ curl -sSf | sh

Bridging Snips and Home Assistant

Messages between Snips and Home Assistant are passed via MQTT. We must tell Home Assistant which MQTT broker to use by adding the following entry to configuration.yaml (replacing MQTT_BROKER_IP and MQTT_BROKER_PORT with appropriate values):

mqtt:
  broker: MQTT_BROKER_IP
  port: MQTT_BROKER_PORT

As a convenience, Snips can run its own MQTT broker on port 9898. If we wish to use this broker, the entry looks as follows:

mqtt:
  broker: 127.0.0.1
  port: 9898

If we wish to use an external MQTT broker, we must change these values accordingly. Furthermore, we must tell Snips not to start its own broker but rather use an external one. This is specified in the launch parameters, as explained in the “Running the assistant” section below.

Next step: creating our Snips Assistant!

Creating a Snips Assistant

Step-by-step guide to building an assistant and deploying it on a Raspberry Pi.

We head over to the Snips web console to build our assistant.

We choose the default setup: Raspberry Pi, English, and Snips On-Device ASR.

Now, we could go on and create intents for light queries manually. This involves adding a few sample phrases such as “Turn on the lights” and tagging them. However, Snips already offers intent bundles, which are collections of intents pre-trained on larger datasets and thoroughly tested. For our purposes, we choose the “IoT” intent bundle, which includes intents to control the lights, but also the heating, television, speakers and more.

The IoT intent bundle packages a few intents to control home IoT devices.

We then click Save, and launch the re-training of the assistant. Once done, we can download the assistant model.

Installing the assistant on the Pi

A file containing our assistant will be downloaded; we copy it over to our Pi:

$ scp pi@pi_hostname:/home/pi/

and install it using the snips-install-assistant helper script:

$ ssh pi@pi_hostname
$ sudo snips-install-assistant

Running the assistant

Make sure a microphone is plugged into the Pi and correctly detected. If you are having trouble setting up audio, we have written a guide on Raspberry Pi Audio Configuration. Snips is launched via the snips command:

$ snips

Snips is now ready to take voice commands from the microphone. To trigger the listening, simply say

“Hey Snips”

followed by a command, e.g.

“Turn the lights green”

We should see the transcribed phrase in the logs, as well as a properly parsed intent. The last step is to have Home Assistant react to these intents.

Optional: specifying an external MQTT broker

As explained earlier, by default Snips runs its own MQTT broker. But we can also tell Snips to use an external broker by specifying this when launching Snips. In this case, instead of running the snips command above (which assumes we are using the internal MQTT broker), we use the full launch command with explicitly specified parameters (replace MQTT_BROKER_IP and MQTT_BROKER_PORT with appropriate values):

$ docker run -t --rm --name snips --log-driver none -v /home/pi/.asoundrc:/root/.asoundrc -v /opt/snips/config:/opt/snips/config --privileged -v /dev/snd:/dev/snd snipsdocker/platform --mqtt MQTT_BROKER_IP:MQTT_BROKER_PORT

For more details on launch options, check the documentation on Snips Platform Commands.

Adding Home Assistant triggers

As mentioned above, the output of Snips is an intent. For instance,

“Turn the kitchen lights blue”

will output the following intent:

"text": "Turn the kitchen lights blue",
"intent": {
"intentName": "ActivateLightColor",
"probability": 0.9521337
"slots": [
"entity": "objectLocation",
"slotName": "objectLocation",
"value": {
"kind": "Custom",
"value": "kitchen"
"entity": "objectColor",
"slotName": "objectColor",
"value": {
"kind": "Custom",
"value": "blue"

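To see what a handler works with, here is a rough sketch in plain Python of consuming such a payload, flattening the slots into a simple name-to-value mapping (field names taken from the JSON above):

```python
import json

# Intent payload as produced by Snips (copied from the example above)
payload = """
{
  "text": "Turn the kitchen lights blue",
  "intent": {"intentName": "ActivateLightColor", "probability": 0.9521337},
  "slots": [
    {"entity": "objectLocation", "slotName": "objectLocation",
     "value": {"kind": "Custom", "value": "kitchen"}},
    {"entity": "objectColor", "slotName": "objectColor",
     "value": {"kind": "Custom", "value": "blue"}}
  ]
}
"""

intent = json.loads(payload)
# Flatten the slot list into a name -> value mapping
slots = {s["slotName"]: s["value"]["value"] for s in intent["slots"]}
print(intent["intent"]["intentName"], slots)
```

Home Assistant's Snips component does this unpacking for us; the sketch only illustrates the structure of the message.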
We now have to tell Home Assistant how to handle such intents. This is done via actions, defined in configuration.yaml, using the Home Assistant Script Syntax. For instance, to handle the above intent and trigger a light color change in a given room, we add the following entry:

- service: light.turn_on
  data_template:
    entity_id: light.{{ objectLocation | replace(" ","_") }}
    color_name: "{{ objectColor }}"

What is happening here is that the Snips custom component will invoke the script as defined in the action field, passing the intent slots as parameters. The templating system used by Home Assistant allows us to perform more complex actions involving conditionals, loops, states, and more. For a detailed explanation, see the Templating Guide.
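To make the template concrete: the replace(" ","_") filter maps a spoken location such as "living room" onto Home Assistant's underscore-separated entity ids. In plain Python, the transformation is equivalent to:

```python
def entity_for_location(location: str) -> str:
    # Mirrors the Jinja template: light.{{ objectLocation | replace(" ","_") }}
    return "light." + location.replace(" ", "_")

print(entity_for_location("kitchen"))      # light.kitchen
print(entity_for_location("living room"))  # light.living_room
```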

That’s it! We now have a voice-controlled Home Assistant setup running entirely on-device! A sample configuration file can be found on Github. We invite you to contribute and make it better! Comments and suggestions are welcome, just ping us on Slack!
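Putting the pieces together, the relevant sections of configuration.yaml from this walkthrough would look roughly as follows. The exact nesting of intent actions under the Snips component should be checked against the component's documentation and the sample file; the layout below is our assumption:

```yaml
# Philips Hue lights
light:
  platform: hue

# MQTT broker (here: the broker embedded in Snips)
mqtt:
  broker: 127.0.0.1
  port: 9898

# Snips intent handling
snips:
  intents:
    ActivateLightColor:
      action:
        - service: light.turn_on
          data_template:
            entity_id: light.{{ objectLocation | replace(" ","_") }}
            color_name: "{{ objectColor }}"
```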

If you enjoyed this article, it would really help if you hit recommend below :)

Follow us on Twitter @michaelfester and @snips

If you want to work on AI + Privacy, check our jobs page!