LED Jacket Couture Powered By Google Home

Patrick Gunderson
6 min read · Mar 20, 2017

We set out to apply some recent advances in IoT which enable us to control interactive devices in our homes through the power of speech.

Overview

The Oracle of North America is an application of selected emerging technologies that together form an integrated and fun system for Google Home, one that can reach through the cloud and effect change in the real world through a bespoke, LED-laden wearable jacket.

What is The Oracle?

The Oracle of North America acts as a pathway of confirmation and discovery. Ask it a question and it will respond. Ask the color of your Aura, and it will show you.

Technologies

  • Google Assistant action
  • API.ai
  • Cloud-based web application
  • Embedded computing
  • Microcontroller
  • LED lighting

The Oracle

Google Assistant

There are two main interface points for users: on one end is the Aura Jacket, and on the other, the Google Assistant. Google Home is a device that uses the Google Assistant to translate speech commands into action.

Setting up the Google Assistant was a learning experience. We found a number of friction points and requirements that increased our anticipated level of effort.

  • Lack of support for private actions means that our action had to be suitable for public consumption, and controlling a one-off piece of hardware isn’t good enough to pass the approval process. So, we added new functionality that would make the action more generally useful.
  • SSL is required to interface with Google Assistant, so we had to get a domain and a certificate. It’s a good idea for production apps, but it complicated our demo.

Setting up the AI

The Google Assistant sends the transcribed speech to API.ai, which uses a set of rules to determine what to do with the text.

The API.ai agent system is fairly straightforward and easy to use. We set up a number of “intents” that define what our bot will respond to, both vocally and physically, using the jacket as a display output.

Connecting those intents to Google Assistant added a layer of complexity. Thankfully the team at Tool is accustomed to setting up projects and servers through Google Cloud, which made the process easier. We set up an API server of our own on Google Compute Engine, running Node.js. The API service communicates with Google Assistant over a RESTful interface, and the same server hosts a real-time socket connection to the embedded computer that runs the jacket, forwarding the appropriate commands as they arrive from Google Assistant.
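
As a rough sketch of that server-side plumbing (illustrative only, not our production code; the port and the 'aura' event name are placeholders), the socket half can be as small as this:

    // Cloud-side socket hub (sketch). The jacket's Pi connects here and waits for
    // commands; forwardToJacket() is what the fulfillment code calls.
    const http = require('http');
    const socketIo = require('socket.io');

    const server = http.createServer();
    const io = socketIo(server);

    io.on('connection', socket => {
      console.log('jacket connected:', socket.id);
    });

    // Broadcast a colour command to any connected jacket.
    function forwardToJacket(color) {
      io.emit('aura', color); // e.g. { r: 255, g: 0, b: 180 }
    }

    server.listen(8080);
    module.exports = { forwardToJacket };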

Application Servers

Once the command received by Google Assistant has been interpreted through the API.ai intents system, API.ai sends a message to our always-available server, which decides how to handle the command. For some commands, The Oracle sends back a response that the Google Assistant reads to the user, but when you ask The Oracle what color your aura is, our server also signals the Aura Jacket to display that color.
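
A minimal sketch of that branch point, assuming the API.ai v1 webhook format (an action name and parameters in the request, speech and displayText in the reply) and reusing the hypothetical forwardToJacket() helper from the socket sketch above; the action name and aura palette here are placeholders:

    // API.ai fulfillment webhook (sketch, not our production handler).
    const express = require('express');
    const { forwardToJacket } = require('./socket-hub'); // hypothetical module from the sketch above

    const app = express();
    app.use(express.json());

    // A made-up palette; the real Oracle decides colours its own way.
    const AURAS = [
      { name: 'violet', r: 148, g: 0, b: 211 },
      { name: 'teal', r: 0, g: 128, b: 128 },
    ];

    app.post('/webhook', (req, res) => {
      const action = req.body.result && req.body.result.action;

      if (action === 'read_aura') {
        const aura = AURAS[Math.floor(Math.random() * AURAS.length)];
        forwardToJacket(aura); // light the jacket up over the socket
        const speech = `Your aura is ${aura.name}.`;
        return res.json({ speech, displayText: speech });
      }

      // Everything else gets a plain spoken reply.
      res.json({ speech: 'The Oracle is listening.', displayText: 'The Oracle is listening.' });
    });

    app.listen(3000);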

Of the many options for running a server, we chose Google Compute Engine. We love Compute Engine’s Linux-based cloud servers and use them on many of our projects, so keeping everything within the same integrated (and documented) ecosystem was an easy decision.

Aura Jacket

The Aura Jacket is what takes The Oracle beyond a typical Google Assistant action. In addition to interacting in virtual space, The Oracle can control things in real space, remotely and wirelessly.

When The Oracle receives a request to read someone’s aura, it determines the color of the aura, then sends a command to the Aura Jacket to display the color in addition to responding verbally.

Embedded software

There were two main pieces of embedded software: one to communicate with the outside world and one to drive the LEDs. Driving the LEDs themselves was pretty easy, as we built on top of the excellent FastLED library. Connecting to the outside world took a bit more work.
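
As a sketch of the "outside world" half, the Pi-side bridge is a small Node.js process that keeps a socket open to the cloud server and relays whatever it receives down the serial line to the Teensy. The server URL, event name, device path, and baud rate below are assumptions, and the byte framing is simplified (the real protocol is described further down):

    // Pi-side bridge (sketch): cloud socket in, serial out to the Teensy.
    const { io } = require('socket.io-client');  // v3/v4-style import
    const SerialPort = require('serialport');    // v4-style constructor; newer serialport versions differ

    const socket = io('https://oracle.example.com');
    const teensy = new SerialPort('/dev/ttyACM0', { baudRate: 115200 });

    socket.on('aura', ({ r, g, b }) => {
      // Hand the requested colour to the LED controller (framing simplified here).
      teensy.write(Buffer.from([r, g, b]));
    });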

Control Panel

During initial development, we built a local control interface for the Aura Jacket using what would traditionally be called web technologies. Those technologies, though, have grown to be powerful off the web as well. We use Node.js on the Pi to serve a webpage with a touch-based control panel that can access all of the functionality in the Aura Jacket.

The control panel gave us the ability to test and control the LEDs independently of the cloud integration. It also let us control the jacket when no internet connection is available, speeding up development and increasing reliability.
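
A stripped-down version of that panel's server side might look like the following; the route and payload shape are placeholders rather than the panel's real API:

    // Local control panel (sketch): Express on the Pi serves the touch UI
    // from a static folder and exposes a simple endpoint the page can call.
    const express = require('express');
    const app = express();

    app.use(express.json());
    app.use(express.static('public')); // the touch-based control page lives here

    app.post('/led/color', (req, res) => {
      const { r, g, b } = req.body;
      // In the real build this gets forwarded over serial to the Teensy.
      console.log(`set colour to rgb(${r}, ${g}, ${b})`);
      res.sendStatus(204);
    });

    app.listen(3000, () => console.log('Control panel listening on port 3000'));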

Embedded Hardware

To run the Aura Jacket itself, we explored a number of embedded computing solutions, paying special attention to power requirements, performance potential, built-in capabilities, and price.

Generalist or Specialist?

One of our early goals was to find and use a single-board platform that could run the Aura Jacket on its own, but after evaluating a number of contenders we found that a solution combining multiple boards would be more stable and robust.

Our hybrid solution uses a Raspberry Pi 3 to handle business logic and Wi-Fi connectivity, paired with a Teensy 3.2 microcontroller to drive the LEDs. The Pi and the Teensy communicate over a serial connection using a custom, minimal protocol loosely based on MIDI.
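
To make "loosely based on MIDI" a little more concrete, here is one way such a framing can look. This is an illustration of the idea rather than our actual byte layout: a status byte with the high bit set marks the start of a message, followed by 7-bit data bytes, so the Teensy can always resynchronise on the next status byte if a byte goes missing.

    // Illustrative MIDI-style framing (hypothetical command id and layout).
    const CMD_SET_COLOR = 0x90; // status byte: high bit set marks the start of a message

    function encodeSetColor(r, g, b) {
      // Squeeze each 8-bit channel into 7 bits so data bytes can never be
      // mistaken for a status byte.
      const to7bit = v => (v >> 1) & 0x7f;
      return Buffer.from([CMD_SET_COLOR, to7bit(r), to7bit(g), to7bit(b)]);
    }

    // e.g. teensy.write(encodeSetColor(255, 0, 180));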

As with our Compute Engine servers, we chose a general-purpose Linux machine. Linux gave us the flexibility to build on top of excellent pre-existing tools and to adjust smoothly as we inevitably ran into snags.

LEDs

We chose to go with strips of WS2812s (NeoPixels) due to their broad use and availability. They aren’t the easiest chips to work with, requiring a very precise 800 kHz clock to control, but because they are so widely used, there are numerous libraries available to ease development. We tried driving the LEDs directly from each of the Linux-based computers, but because Linux isn’t a real-time operating system we ran into timing problems, and opted for a dedicated external LED controller (the Teensy).

The Teensy controller is theoretically capable of driving over 2000 LEDs (with a separate power supply) directly from its GPIO, which is more than enough for our purposes, so we opted not to use the excellent OctoWS2811 expansion, which is designed specifically for driving large numbers of LEDs.
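
A quick back-of-the-envelope check, using nominal WS2812 figures rather than measurements from our build, shows why a single data pin was enough here and where OctoWS2811's eight parallel outputs would start to matter:

    // WS2812 refresh-rate estimate: 24 bits per LED at a nominal 800 kHz,
    // i.e. roughly 30 µs of frame time per LED on a single data pin.
    const BITS_PER_LED = 24;
    const DATA_RATE_HZ = 800000;

    const frameMs = numLeds => (numLeds * BITS_PER_LED / DATA_RATE_HZ) * 1000;

    console.log(frameMs(2000).toFixed(0) + ' ms per frame'); // ~60 ms, i.e. ~16 fps at 2000 LEDs
    console.log(frameMs(300).toFixed(0) + ' ms per frame');  // ~9 ms for an assumed 300-LED garment

At jacket scale a single output refreshes far faster than the eye can follow; it's only in the thousands of LEDs that splitting the data across OctoWS2811's parallel outputs becomes worthwhile.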

Power

Powering the wearable setup was another challenge with a number of potential options. We could go with off-the-shelf battery packs meant for running drones or charging cell phones, or build a custom solution using other lithium batteries or standard AAs.

The jacket has the potential to pull around 3 amps of current, which is a lot in a wearable. We mitigated this draw by turning the brightness on the LED strips down to about half, cutting our total power draw by about 30%.
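
For a sense of where a number like that comes from, here is a rough estimator built on common rule-of-thumb figures: roughly 60 mA per WS2812 at full white, about 1 mA of quiescent draw per LED, plus a fixed allowance for the Pi and Teensy. The LED count and animation duty cycle below are assumptions rather than the jacket's actual spec, and the fixed overhead is also why halving brightness trims total draw by roughly a third rather than a full half.

    // Rough current estimate for the wearable (all figures are rule-of-thumb assumptions).
    const LED_FULL_WHITE_A = 0.060; // ~60 mA per WS2812 at full white
    const LED_QUIESCENT_A = 0.001;  // ~1 mA per LED even when dark
    const CONTROLLER_A = 0.7;       // assumed allowance for the Pi 3 plus Teensy

    function estimateAmps(numLeds, avgDuty, brightness) {
      // avgDuty: how "lit" a typical animation is (0..1); brightness: global scale (0..1).
      return CONTROLLER_A + numLeds * (LED_QUIESCENT_A + LED_FULL_WHITE_A * avgDuty * brightness);
    }

    // An assumed 100-LED jacket running a moderately bright animation:
    console.log(estimateAmps(100, 0.35, 1.0).toFixed(2) + ' A'); // ≈ 2.90 A at full brightness
    console.log(estimateAmps(100, 0.35, 0.5).toFixed(2) + ' A'); // ≈ 1.85 A at half brightness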

At first we looked at using three packs of 4×AA NiMH cells; four NiMH cells in series give a nominal 4.8 V, close enough to the 5 V the LED strips expect. Using multiple packs has the benefits of letting us quickly swap out batteries and distribute power throughout the wearable without long, fragile cable runs. We ran into some trouble with the battery holders, though, and the wiring had to be modified.

In the end, we again deferred to an off-the-shelf solution and chose a large monolithic battery pack with standard USB connectors, paired with high-power micro-USB cables.

Cabling

Connecting everything together was another area where we learned a lot. The long cables running to the sensors and LED strips carry high-frequency signals, and they picked up a good amount of static that interfered with communication. We ended up switching from custom-built cables to shielded USB cables: even though we’re not using the USB protocol, our requirements are very similar, and the cables are ubiquitous.

Conclusion

In the end, The Oracle is a project marked by learning and compromise (as are all projects). We finished with a functional prototype that bridges the gap between virtual space and real space, taking voice commands from the Google Home and responding via a custom device.

Appendix

[Table: power-consumption comparison of the embedded boards we evaluated, included as an image in the original post because Medium doesn’t do inline tables.]

Our chosen stack isn’t the most power-efficient, but given the trade-off for ease of development and timeliness, it fit the bill, mostly at the expense of battery life.

