Making my own Spot Mini

Alexander Watson
8 min read · Feb 6, 2019


This blog captures my journey to combine openly available hardware with some mechanical design, 3D printing, and software to create an affordable, open-source robot with capabilities like Boston Dynamics’ Spot Mini.

Why?

Boston Dynamics projects like Atlas, BigDog, and Spot Mini are super cool. They can dance, open doors, and (semi)autonomously navigate through rooms. The potential applications for a quadruped robot that could help you around your house or outdoors are incredible. I’d love to buy one personally, but apparently you need to be Jeff Bezos to have one of your own.

What we know about Spot Mini

So… this is the beginning of an experiment to see if I can build something just like the Spot Mini. Below is a recent video of a SpotMini in action. Initial observations: it’s obviously a quadruped, with two axes of movement at the shoulders and a single axis at the elbow. Interestingly, it doesn’t have any range of movement in its feet. It has what looks like an array of four (depth?) cameras on each side and, from zooming in super close on the pictures above, a whopping eight cameras for sensing in the front. The movement is super lifelike, with none of the joltiness of the servo-driven robots available from Amazon and the like. Step 1 is going to be finding a robot with similar capabilities that I can mod for similar movement.

From other Boston Dynamics videos it’s clear that SpotMini is given high-level movement direction from a human operator and is not fully autonomous. Also, from videos of the operators kicking or pulling at the SpotMini (which I’m sure the human race will pay for some day), we can tell that each of the limbs and the body have some level of autonomy and are not simply following pre-programmed movements.

Great, time to start building. The most lifelike and affordable robot that I’ve been able to find is the Reach Robotics Mekamon, an excellent robot built for augmented reality gaming and available at the Apple Store for $249. Pros: each of the Mekamon’s four legs is motor-driven with 3 degrees of freedom, similar to the Spot Mini, just arranged more like an insect than the Spot Mini’s dog-like stance. Two motors in the shoulder provide two axes of movement, and a motor in the elbow provides a single axis, though perhaps with less range of motion. Cons: there are no APIs yet for the Mekamon robot, so you can only control it from the Mekamon app. We’ll see about that...

First step—An API for Mekamon

The first step in giving our Mekamon robot Boston Dynamics-like abilities is getting programmatic control of the robot’s motion, so we can, you know, do things. There aren’t any public APIs published, so the only way to interface with it currently is through the Mekamon app. I need to be able to control it from a computer and keyboard initially. There are a few ways I can think of to hack in and create my own API to control the robot, arranged below from easiest to most complex.

How to Hack a Robot

  1. Capture the Bluetooth Low Energy packets between my iPhone’s Mekamon app and the robot. Reverse engineer the command protocol, and build my own API that emulates a connected phone. Pros: super clean interface to control the robot. Cons: the most limited interface, since we can only use functionality that already exists in the manufacturer’s protocol, and likely limiting in the future. Also, I’ve seen some weird stuff in non-documented APIs.
  2. Disassemble the Mekamon and see if I can find any USB, debug, or serial connectors that I can use to dump the firmware and get shell access to the Mekamon. Reset the firmware, then sniff the new firmware update as it is downloaded from the iOS device to the Mekamon. Break out the firmware image, find where passwords are stored, and activate a shell. Re-flash with modified firmware via BLE. Pros: shell access would give deep control of the robot. Cons: it’s easy to go down a rabbit hole and this could take a long time; the vulnerabilities could also be patched and I’d be back at square one.
  3. Reverse engineer the communications bus and the digital/SPI/I2C protocol used to control the leg motors. Pros: this would allow me to put the Mekamon legs on anything and have basically unrestricted access to movement. Cons: I won’t have access to any of the pre-programmed movements that the Mekamon already has, but that’s okay.

Hacking attempt #1 — Sweeping the Leg

Never one to pass up a chance to tear down electronics, let’s try what is likely the hardest option (#3) first. Goal: identify the protocol being used by the Mekamon by connecting an oscilloscope and/or logic analyzer to monitor the pins going to each leg. From there, I can possibly reverse engineer the protocol (SPI or I2C?) and figure out how to control the leg motors directly.

The wires going to the leg are not labeled on the PCB, but I was able to take the leg apart and map out the pins, and then listen to the signals going over the wires with an oscilloscope (see below).

A standard 5x2 IDC ribbon cable with 1mm pitch fits the connector on the main board after removing the polarity notch from the cable. From what I was able to see with the oscilloscope (not a full-blown logic analyzer), there appear to be three positive/negative power pairs going to each leg, presumably one per motor, three control lines using an unknown protocol (probably I2C or SPI), and one unused pin.

As you can see from the video above, the shoulder motor uses pulse-width modulation (PWM) for motor control and a rotary sensor to track position. The other motors are controlled via a second SPI/I2C interface. So to make this work, I’ll need a logic analyzer to reverse-engineer the SPI/I2C protocol going to motors 2 and 3 above, as well as a microcontroller to generate the PWM and read the rotary position sensor for motor 1 in the shoulder.
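If I ever come back down this path, the microcontroller side for the shoulder motor might look roughly like the MicroPython sketch below. It’s only an illustration: the pins, the motor-driver wiring, and the assumption of a simple analog rotary sensor are all hypothetical.

# Rough MicroPython sketch (hypothetical pins and wiring): drive the shoulder
# motor with PWM and read an analog rotary position sensor.
from machine import ADC, PWM, Pin
import time

motor = PWM(Pin(16))      # PWM output to a motor driver (placeholder pin)
motor.freq(1000)          # 1 kHz PWM carrier
position = ADC(Pin(26))   # analog rotary position sensor (placeholder pin)

TARGET = 30000            # desired sensor reading (0-65535), made up for this sketch

# Very naive proportional loop: nudge the motor toward the target position
while True:
    error = TARGET - position.read_u16()
    duty = max(0, min(65535, 32768 + error))  # crude error-to-duty mapping
    motor.duty_u16(duty)
    time.sleep_ms(10)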

Thank you, next. Getting in through Bluetooth is looking better and better! May come back to look at this later…

Attempt #2 — Bluetooth for the win!

Below is an input/output graph from sniffing the Bluetooth Low Energy data communications between my iPhone and the Mekamon using Wireshark and an Adafruit Bluefruit LE sniffer. If you’re trying this yourself, use the display filter `btatt.opcode == 0x52` to zero in on the data messages when using a Bluefruit LE sniffer v1.

Below is the Wireshark I/O graph of me connecting Mekamon app on my iPhone to the robot (filtering out broadcast messages). You can see a small burst of 3 initialization messages, followed by a lot of traffic a few seconds later.

So, for next steps, we’ll take a look at these messages, extract the message payloads, and replay them to see if we can get control of the robot. If we’re successful, we’ll reverse-engineer the movement commands and build a library in Python to control the robot.

Bingo! Screenshot of one of the two initialization messages [0307010c00] required to control the Mekamon.
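For reference, replaying a captured payload from a laptop could look roughly like the sketch below, using the bleak library. The device address and GATT characteristic UUID are placeholders you’d pull from your own scan and capture, and the second initialization payload isn’t shown here.

# Minimal sketch: replay a captured init payload to the Mekamon over BLE.
# The address and characteristic UUID are placeholders; find yours by scanning
# and by inspecting the ATT writes in the Wireshark capture.
import asyncio
from bleak import BleakClient

MEKAMON_ADDRESS = "AA:BB:CC:DD:EE:FF"                       # placeholder
COMMAND_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"  # placeholder

INIT_PAYLOAD = bytes.fromhex("0307010c00")  # init message captured above

async def main():
    async with BleakClient(MEKAMON_ADDRESS) as client:
        # An ATT Write Command (opcode 0x52) is a write without response
        await client.write_gatt_char(COMMAND_CHAR_UUID, INIT_PAYLOAD, response=False)

asyncio.run(main())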

After replaying the commands above, the Mekamon is in “free drive” mode and waits for new instructions over Bluetooth. Nice! Reversing the communications protocol was not as easy. I wrote a wrapper for tshark that dumped all message payloads to the screen as I ran through joystick inputs (forward/back/left/right/strafe/up/down); a rough sketch of that wrapper follows the command list below. Using my logging app and testing a single degree of movement at a time in free drive mode, it was pretty easy to identify and label motion commands, for example:

02 06 01 01 01 0C 00 # corresponds to “Idle”
02 06 01 02 50 5C 00 # corresponds to “Turn”
02 04 03 07 78 89 00 # corresponds to “Stand up”
02 04 03 07 28 39 00 # corresponds to “Sit down”
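Here’s a rough sketch of what that tshark wrapper looks like. It’s an illustration rather than the exact script I used, and it assumes a capture file saved from the Bluefruit LE sniffer; the file name is a placeholder.

# Minimal sketch: dump Mekamon BLE command payloads from a sniffer capture.
# Assumes tshark is installed; the capture file name is a placeholder.
import subprocess

CAPTURE = "mekamon_sniff.pcapng"  # saved from the Bluefruit LE sniffer

cmd = [
    "tshark", "-r", CAPTURE,
    "-Y", "btatt.opcode == 0x52",   # ATT Write Commands (data messages)
    "-T", "fields", "-e", "btatt.value",
]

out = subprocess.run(cmd, capture_output=True, text=True, check=True)
for line in out.stdout.splitlines():
    if line:
        print(line)  # e.g. 02:06:01:02:50:5c:00 while turning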

My initial guesses were:

  • [Byte 1] packet header
  • [Byte 2] motion command
  • [Bytes 3–5] signed int bytes for the x, y, z axes, with 256 possible speed values
  • [Byte 6] CRC
  • [Byte 7] End of packet

But I was stuck on the middle byte values, which are confusing because two or more axis values can change even with only a single movement such as “turn” or “forward” selected in the app. Fortunately, some Google searching brought me to an augmented reality Mekamon project by Wes Freeman, where he figured out that the Mekamon protocol uses the Consistent Overhead Byte Stuffing (COBS) method for framing packets. From Wes’s blog, here’s an example motion command:

02 06 01 01 01 0C 00
where [02] is the COBS header, [06] the motion command, [01 01 01] the COBS-encoded zeros for the three axes, [0C] the checksum, and [00] the terminator.

This answers the question of why “01 01 01” for the x, y, and z axes above corresponds to idle, or why “01 02 50” corresponds to turning to the right: COBS replaces zero (“00”) data bytes in messages with non-zero values, since “00” is reserved for packet boundaries. And from the captured frames, the checksum (byte 6) is simply the sum of all previous bytes plus one, modulo 256.
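As a quick sanity check (my own sketch, using the frames captured above), that checksum rule holds for every command so far:

# Verify: checksum byte = (sum of the preceding bytes + 1) % 256
def mekamon_checksum(payload: bytes) -> int:
    return (sum(payload) + 1) & 0xFF

# payload (first 5 bytes) and captured checksum (6th byte) from the frames above
captures = {
    "idle":     ("0206010101", 0x0C),
    "turn":     ("0206010250", 0x5C),
    "stand up": ("0204030778", 0x89),
    "sit down": ("0204030728", 0x39),
}

for name, (payload_hex, expected) in captures.items():
    computed = mekamon_checksum(bytes.fromhex(payload_hex))
    print(f"{name}: computed 0x{computed:02X}, captured 0x{expected:02X}")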

# 02 06 01 01 01 0C 00 corresponds to "Idle"
$ python3
>>> from cobs import cobs
>>> idle_command_cobs = b'\x02\x06\x01\x01\x01'  # payload only, no checksum/terminator
>>> [x for x in cobs.decode(idle_command_cobs)]
[6, 0, 0, 0]

So X,Y,Z all equal zero for Idle, which makes sense. Now to validate our hypothesis with the “turn” command…

# 02 06 01 02 50 5C 00 corresponds to "Turn right"
$ python3
>>> from cobs import cobs
>>> turn_command_cobs = b'\x02\x06\x01\x02\x50'  # payload only, no checksum/terminator
>>> [x for x in cobs.decode(turn_command_cobs)]
[6, 0, 0, 80]

It’s pretty clear that the Z coordinate here maps to turning, with an integer value of 80 in the example above (-128 to 127 are the possible values with a signed integer byte). Awesome!

With that context, it was pretty easy to reverse the other Mekamon commands for movement, height adjustment, and the custom motions that are going to be super important for a later phase of the project: creating dog-like movement.

# 02 04 03 07 78 89 00 corresponds to "Stand up"
# 02 04 03 07 28 39 00 corresponds to "Sit down"
$ python3
>>> from cobs import cobs
>>> stand_command_cobs = b'\x02\x04\x03\x07\x78'
>>> sit_command_cobs = b'\x02\x04\x03\x07\x28'
>>> [x for x in cobs.decode(stand_command_cobs)]
[4, 0, 7, 120]
>>> [x for x in cobs.decode(sit_command_cobs)]
[4, 0, 7, 40]
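
Going the other direction, building a command frame is just the reverse: COBS-encode the decoded bytes, then append the checksum and the 0x00 terminator. Here’s a minimal sketch of my own (the helper names are mine, not the Mekamon app’s):

# Build a Mekamon command frame: COBS-encode, append checksum and terminator
from cobs import cobs

def mekamon_frame(decoded: bytes) -> bytes:
    encoded = cobs.encode(decoded)
    checksum = (sum(encoded) + 1) & 0xFF
    return encoded + bytes([checksum, 0x00])

def to_byte(v: int) -> int:
    # pack a signed value in -128..127 into a single unsigned byte
    return v & 0xFF

def motion_command(x: int, y: int, z: int) -> bytes:
    # command id 6 is motion, followed by signed bytes for the three axes
    return mekamon_frame(bytes([0x06, to_byte(x), to_byte(y), to_byte(z)]))

print(motion_command(0, 0, 80).hex())  # '02060102505c00', the "turn" frame above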

I made a Python library that simplifies sending motion control commands to the Mekamon via a UDP listener, plus a basic keyboard controller to use in the next phase of the project (or to let you control your Mekamon from a keyboard). Check it out on GitHub at:

https://github.com/zredlined/control-my-mekamon
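
As a usage sketch, any controller just needs to forward messages to that UDP listener. The port and message format below are hypothetical placeholders, so check the repo for the actual interface.

# Hypothetical usage sketch: send motion messages to a local UDP listener.
# The port and the message strings are placeholders; see the repo for the
# real interface.
import socket

UDP_ADDR = ("127.0.0.1", 5005)  # placeholder host/port for the listener
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send(msg: str) -> None:
    sock.sendto(msg.encode(), UDP_ADDR)

send("forward 40")  # hypothetical: walk forward at moderate speed
send("turn -80")    # hypothetical: turn left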

Here’s a full walk of project Mekamon using my API and keyboard controller.


Check out part 2, where we add computer vision and autonomous movement using Apple ARKit!



Alexander Watson

Co-Founder at Gretel.ai, previously GM at AWS. Love artificial intelligence and security. @alexwatson405