Experimenting with Robots and Cloud IoT Core

Gus Class
Jul 11, 2018


For Google I/O 2018, I delivered a presentation with Gabe Weiss covering Cloud IoT Core. You can watch the full presentation here:

For the presentation, my responsibility was to show the basic usage of the product and to demonstrate a more substantial but still educational example of it. I also wanted to produce something engaging and fun for the audience of the talk, so I started brainstorming. Because I had just released the Arduino library for Cloud IoT Core and have extensive experience building “fun” projects with Arduino, I opted to use this platform as the basis for the demo.

My first idea was to produce an LED light, similar to the ones mentioned in this blog post. In fact, the mirrored enclosures in that post were prototypes that I was going to control with Cloud IoT Core, and the Cube is fully controllable via the Internet and Cloud IoT Core. My partner, Gabe, had the clever idea of using the light box as a point of curiosity during the presentation, which probably would have worked out well.

Having already delivered LED demos with Android Things in 2017 as part of the Google Developer Days, I opted to take an entirely different direction for this demo: a robot that could be configured using Cloud IoT Core. In the previously linked I/O presentation, you can see the robots demoed at around the 41-minute mark. This post covers the journey to producing the demo and highlights key portions of the code used for controlling the robots.

Selecting a Robot

Just a small robot army

Before I ended up with the Hexbug spider toy that is popular with hackers, I pored over my existing robots from previous projects. Flying robots, maker-friendly robots, toy robots, and inverse-kinematic robots are spread all over my apartment, so this became an exercise in narrowing the list down to a robot that could be easily controlled and built quickly enough to be demoable.

I started with the robot platform from Adafruit that came with the AdaBox 002. The robot uses a 32u4 chip with BLE and was built around the Feather form factor. This was great because I could, in theory, replace the existing “brain” with an IoT-connected one that would work with our Arduino library. After some trial and error, I found it difficult to drive the H-bridge that connected the robot to its motors from the ESP8266 Feather, and moved on.

Next, I tried tinkering with the DFRobot Vortex robot that I got from a Kickstarter campaign a few years ago. My plan was to either use the BLE interface or find an I2C/SPI bus on the robot that would accept input from an ESP. Again, this proved too difficult on the short schedule I had, and I quickly realized that I could not procure additional robots in time to build a compelling demo.

Finally, I was digging through my desk at work and noticed another toy robot from a Kickstarter campaign, the Bots_alive project, which is built around the Hexbug spider toy. The Kickstarter shipped with an IR blaster that would emulate the toy’s controller. I was excited because I’d built IR blasters around the ESP8266 before, and I found that the robots could be picked up inexpensively online.

With the help of the Arduino Hexbug spider library, and using IRremote’s IRsend as a drop-in replacement for its IR code, I was able to quickly build a minimal prototype that allowed IR control of the toys.

Robot Code Overview

For simplicity, I have broken the functions of the robot into separate header files. Although it’s not the prettiest way to separate functionality, this approach is common with Arduino projects because the IDE shows each header as its own tab.

Functionality broken out by header in the Arduino IDE

I’ve broken the functionality down to:

  • hexspider_esp8266.ino
    The “main” code including the Arduino setup() and loop() functions
  • backoff.h
    Implements truncated exponential backoff with jitter to avoid spamming the backend in error conditions
  • blinky.h
    Driver code for the status LEDs
  • ciotc_config.h
    Configuration for the Cloud IoT Core project and robot
  • cli.h
    Command-line interface that is useful when testing
  • esp8266_wifi.h
    ESP8266-specific code for WiFi setup and network communication
  • hexbug_spider.h
    Code for sending IR signals to the robots
  • ranger.h
    Code for reading distances using an HC-SR04 sonar ranger
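
To give a sense of what backoff.h implements, here is a minimal Python sketch of truncated exponential backoff with full jitter. The function name and default values are illustrative, not the actual Arduino code:

```python
import random

def backoff_delay(attempt, base_ms=1000, cap_ms=32000):
    """Truncated exponential backoff with full jitter.

    Returns a randomized delay in milliseconds that doubles with each
    failed attempt until it reaches the cap, so the device does not
    hammer the backend in error conditions.
    """
    ceiling = min(cap_ms, base_ms * (2 ** attempt))
    return random.uniform(0, ceiling)
```

The jitter (picking a random delay under the ceiling rather than the ceiling itself) keeps a fleet of devices from retrying in lockstep after a shared outage.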

When the program starts, Arduino calls the setup() function. When this happens, the ultrasonic ranger is configured, the LEDs are used to indicate the current status of the boot procedure, and the robot connects to WiFi to access Cloud IoT Core.

After setup is called, the main loop begins and the code continuously reads its configuration from Cloud IoT Core using the HTTP bridge.

To de-duplicate commands, the configuration version is tracked, and only configurations newer than the last version read are used to reconfigure the robot. Because the configuration messages are transmitted as base64-encoded payloads, we translate directly between the encoded message (1…5) and the configuration for the robot.
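
A small Python sketch of that de-duplication and decoding step (the device itself does this in Arduino C++; the names here are illustrative):

```python
import base64

last_version = 0

def handle_config(version, payload_b64):
    """Apply a configuration only if it is newer than the last one seen.

    The payload arrives base64-encoded, so command '5' is transmitted
    as 'NQ=='. Returns the decoded command, or None for duplicates.
    """
    global last_version
    if version <= last_version:
        return None  # already handled this version, skip it
    last_version = version
    return base64.b64decode(payload_b64).decode('ascii')
```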

When a configuration message indicates that a scan should be performed (5, or NQ== encoded), the robot performs a scan sweep to measure the range to the objects surrounding it.

That is a great segue to the second program: a Flask web app that can be used to send configuration changes to the robot.

Web app code overview

The web app is a concise Flask server that sends data to and receives data from the robots. When it’s running, the app displays a few buttons for manually moving the robots as well as a radar chart of the distances read from the ultrasonic ranger.

The first key thing that the app does is transmit configuration changes to the robots. It does this using the Google API Client Library for Python. The following snippet shows how this is done:

Specifically, at line 17 in the previous snippet, you can see the call to the client library for updating the configuration for the robot. The data payload will be a value between 1 and 5 indicating the current configuration transmitted to the robot.
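
A minimal sketch of that call, assuming the google-api-python-client library; the project, region, registry, and device names below are placeholders, not the ones from the talk:

```python
import base64

def encode_config(command):
    """Base64-encode a command (1-5) for the device config payload."""
    return base64.b64encode(str(command).encode('ascii')).decode('ascii')

def send_config(command, project='my-project', region='us-central1',
                registry='my-registry', device='my-device'):
    """Push a configuration update to one robot via Cloud IoT Core."""
    from googleapiclient import discovery  # pip install google-api-python-client
    client = discovery.build('cloudiot', 'v1')
    name = ('projects/{}/locations/{}/registries/{}/devices/{}'
            .format(project, region, registry, device))
    body = {
        'versionToUpdate': '0',  # 0 = apply regardless of current version
        'binaryData': encode_config(command),
    }
    return (client.projects().locations().registries().devices()
            .modifyCloudToDeviceConfig(name=name, body=body).execute())
```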

The next aspect of the web app is to receive telemetry messages from the robots indicating the ranges of surrounding objects. This is done using the Google Cloud Python client library for Pub/Sub. The following snippet highlights how this is done:

When the Flask app starts, the callback for handling messages is configured; then, as telemetry messages come in, each one is appended to the messages stored in the Flask web app. The messages can be manually reset using the /reset route in the app, or will be cleared when the web server is restarted.
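
A minimal sketch of that subscription, assuming the google-cloud-pubsub library and placeholder project and subscription names; the `messages` list stands in for the storage the Flask app uses:

```python
messages = []

def handle_message(message):
    """Store each telemetry payload and ack so it is not redelivered."""
    messages.append(message.data.decode('utf-8'))
    message.ack()

def start_listening(project='my-project', subscription='my-subscription'):
    """Attach the callback to the Pub/Sub subscription in the background."""
    from google.cloud import pubsub_v1  # pip install google-cloud-pubsub
    subscriber = pubsub_v1.SubscriberClient()
    path = subscriber.subscription_path(project, subscription)
    return subscriber.subscribe(path, callback=handle_message)
```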

Bringing it all together

If you’re interested in running the demo yourself, you will need to build a small circuit around an ESP8266 board and purchase a Hexbug spider. While adding the components to your board, you can test them using the testHardware function in hexspider_esp8266.ino.

Connect the anode of an IR LED to pin 4 (GPIO4 or D2) on the board, and connect the cathode to ground. Connect the ranger so that the trigger pin is on pin 5 (GPIO5 or D1) and the echo pin is connected to pin 15 (GPIO15 or D8). If you’re interested in using LEDs for sending the blink codes and indicating the phenotype of the robot, connect the data pin to pin 13 (GPIO13 or D7) and the clock pin to pin 12 (GPIO12 or D6).

The following image shows how the essential components connect to the board.

Circuit diagram showing the sonar sensor and IR led connected to board

If you haven’t already, clone the project or download a zip.

When you first run the program, make sure to set your board type and port to the correct values in the Arduino IDE (e.g. the ESP8266 board you’re using and its serial port). From the Library Manager, install the “Google Cloud IoT Core” library as well as the “Adafruit DotStar” library used to control the lights. You may need to define the rotation value (e.g. #define HEXBUG_FULL_ROTATION 20) before the inclusion of the hexbug_spider.h header, as I haven’t decided whether to leave it to be calibrated or to provide a value that works for me.

By uncommenting the line //#define HWTEST, you can enable the hardware test mode. After you flash your device, it should trigger motion on a Hexbug spider, log sonar distances to the serial monitor, and blink the RGB LED if you connected it.

After you have confirmed all of the hardware is working using the test function, it’s time to create a Cloud IoT Core project and connect the robots to it.

First, you need to create a Pub/Sub topic and subscription:
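
Assuming the gcloud CLI and a placeholder topic name, the commands look something like:

```shell
# Create a topic for the robots' telemetry...
gcloud pubsub topics create robot-events

# ...and a subscription the web app will pull messages from.
gcloud pubsub subscriptions create robot-events-sub --topic=robot-events
```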

Next, you’ll need to create the Cloud IoT Core device registry:
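
With a placeholder registry name and region, and the telemetry topic from the previous step, that looks roughly like:

```shell
# Create a device registry wired to the telemetry topic.
gcloud iot registries create robot-registry \
    --region=us-central1 \
    --event-notification-config=topic=robot-events
```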

Next, you’ll create the device keys for the robot:
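
The keys are an ES256 key pair; with openssl, generating them looks something like this (file names are placeholders):

```shell
# Generate an ES256 (P-256) private key for the robot; the Arduino
# library signs its JWTs to Cloud IoT Core with this key.
openssl ecparam -genkey -name prime256v1 -noout -out ec_private.pem

# Derive the public key that will be registered with the device.
openssl ec -in ec_private.pem -pubout -out ec_public.pem
```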

Register the device:
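
Using gcloud with placeholder names, registration looks roughly like:

```shell
# Register the robot in the registry with its public key.
gcloud iot devices create my-robot \
    --region=us-central1 \
    --registry=robot-registry \
    --public-key=path=ec_public.pem,type=es256-pem
```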

After registering the device, dump the EC private key for inclusion in ciotc_config.h:
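
One way to print the private key parameters with openssl; which portion ciotc_config.h expects follows the library’s documented format, so treat this as a sketch:

```shell
# Dump the EC key parameters; the hex digits under "priv:" are the
# private key material referenced by ciotc_config.h.
openssl ec -in ec_private.pem -noout -text
```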

Now update ciotc_config.h with your WiFi configuration, device ID, registry name, and so on. Flash your updated code to the robot; if everything works, the robot will nod its head after it connects to WiFi and receives a configuration from Cloud IoT Core.

At this point, if you update the device configuration in the Google Cloud Console by selecting the device from the registry you created, you will see the device respond. You can also use the serial monitor to diagnose problems if things are not working.

Once your device is working, it’s time to start the Flask app.

From the web folder, create a Python virtual environment and then install all of the dependencies:
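
Assuming a requirements.txt in the web folder, the setup looks something like:

```shell
# Create and activate a virtual environment in the web folder...
python3 -m venv env
source env/bin/activate

# ...then install the app's dependencies.
pip install -r requirements.txt
```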

Copy a service account credential file with access to the Cloud IoT Core registry to creds.json in the web folder. Open up server.py and update the server configuration to match the project settings you’re using with your robots. Open index.html and update the device name from “blinky” to the name you used when registering your device. Now you’re ready to run the Flask app:
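
Assuming server.py is the entry point, starting it looks like:

```shell
# Run the Flask app from the web folder (inside the virtual environment).
python server.py
```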

After the app starts, navigate to localhost:5000 and you will see the web server running. Clicking the arrows should send configuration change messages to the robot. Clicking the swirly icon will command the robot to measure its surroundings. There is scaffolding within the app to control four robots independently.

Observations and conclusions

It was really fun to build and demo the robots! Also, interfacing with the Hexbug spider was easier than I expected. If you wanted to, you could extend this control mechanism to other IR toys and devices. For example, you could make a cloud-connected remote for your television, modify a toy through its I2C interface, or control the “ball UFO” devices over their IR port to make a flying robot.

A few folks pointed out after the talk that this demo is something of an anti-pattern for Cloud IoT Core. In other words, using configuration messages to directly control the robots is a less-than-optimal use of the product and of configuration messages in general. In practice, you would generally want more of the robots’ intelligence to live on-board so that they can keep operating when they lose connectivity. With more sophisticated robots, the configuration could be a navigation point or the result of a decision based on sensor data.

For now, I consider this demo to be a fun hack and a great hands-on way to learn and demo Cloud IoT Core. I hope it inspires you!
