Playing Pokemon with Office Dogs and a Raspberry Pi

Sam Rubin
The Storyblocks Tech Blog
6 min read · Jan 31, 2020

What do dogs, Pokemon, and Raspberry Pi have in common? Nothing, until this year at Storyblocks.

Every year at Storyblocks we conduct an in-house hackathon, Innovation Day. This year, my team and I sought to answer the question: Could one of our office dogs play Pokemon?

Gameplay from Pokemon Red

We at Storyblocks love our dogs. If you don’t believe me, just take a look at our 404 page, an Innovation Day project that shipped to production. I’m also a long-time fan of Twitch Plays Pokemon, and I’ve always enjoyed the spinoffs from that iconic experiment. From those two threads, the idea of ‘Dogs Play Pokemon’ was born. If you want a quick showcase of the final product, check out the video below.

The Goal

Our mission was simple: Create a system that would allow dogs to “play” the original Pokemon Red video game and livestream it.

Our only stipulations were that: 1) all gameplay would occur in our office, 2) all inputs to the game had to be triggered by my dog Iggy, and 3) humans could influence Iggy, but only indirectly. What I mean by this last point is that if a human wanted to move a certain direction in the game, they would only be allowed to influence Iggy via encouraging words or a treat; they would not be allowed to directly pick him up.

Iggy: the world’s next best Pokemon trainer

The Problem

There have been experiments to see if the community, AI, and even RNG could beat the original Pokemon Red video game. But this challenge becomes far more difficult when dealing with a dog, whose inputs to the game are likely to be both completely random and infrequent (compared to computer RNG). Pokemon Red has 8 inputs: Up, Down, Left, Right, A, B, Start, and Select, and almost all of them are needed to beat the game. Our team set out to semantically map Iggy’s real-world actions to those 8 in-game actions, with a particular focus on the directional inputs, since we foresaw that navigation would be the most difficult aspect of the game. We also wanted to allow for rapid inputs from Iggy, as frequently as one action per second, to drive gameplay forward. Finally, we wanted human interaction to be helpful but not required, so that Iggy could wander off and play in solo mode.

Implementation

For our solution, Iggy would wear a device that tracked both his gyroscopic movement and acceleration. Although this implementation might allow room for unintentional actions, we would be able to get rapid inputs triggered both by human influence and by Iggy’s own choices. We would also be able to get enough distinct inputs to map to all 8 in the video game. With this plan in mind, we set out to assemble the device.

The Hardware

When deciding what hardware to use, we found the Raspberry Pi, together with the Sense HAT, to be a perfect fit. The Sense HAT has an 8×8 RGB LED matrix, a five-button joystick, a gyroscope, an accelerometer, a magnetometer, and various environment sensors.

We knew immediately we would be able to leverage the gyroscope and accelerometer, and potentially the magnetometer in the future. We could also use the LED matrix to give bystanders feedback on what the current action is.
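
To give a sense of how little code this takes, here’s a minimal sketch of reading those sensors with the official sense-hat Python library (the “A” on the LED matrix is just an example of the feedback idea, not our actual display logic):

    from sense_hat import SenseHat

    sense = SenseHat()

    # Orientation in degrees (pitch, roll, yaw), fused from the gyroscope,
    # accelerometer, and magnetometer.
    orientation = sense.get_orientation()

    # Raw acceleration in Gs along the x, y, and z axes.
    acceleration = sense.get_accelerometer_raw()

    # Echo the current action on the 8x8 LED matrix, e.g. "A" for the A-button.
    sense.show_letter("A")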

A USB power bank would allow us to power the Raspberry Pi and Sense HAT for several hours while Iggy walked around. Fortunately, we had a bunch of spare company-branded ones around the office that we could swap in and out as the charge depleted.

DogsPlayPokemon — Architecture

The Software

With our hardware setup, we next began work on writing code for both the client-side (Iggy with Pi) and server-side (computer with emulation).

A Python script running on the client would read inputs from the Raspberry Pi Sense HAT and send the data as part of an HTTP request to the server. The server would then map the request’s JSON payload to one of the 8 inputs of the Game Boy Color and issue that command to the emulator running on the same server (a desktop in the office). If you want to jump straight into the code, we open-sourced both the client and server side.
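
As a rough sketch, the client loop could look something like this; the server address, payload shape, and one-second polling interval here are illustrative assumptions rather than exactly what’s in the repo:

    import time

    import requests
    from sense_hat import SenseHat

    SERVER_URL = "http://192.168.1.50:5000/input"  # hypothetical server address

    sense = SenseHat()

    while True:
        # Bundle the latest sensor readings into a JSON payload.
        payload = {
            "orientation": sense.get_orientation(),        # pitch/roll/yaw, degrees
            "acceleration": sense.get_accelerometer_raw(),  # x/y/z, in Gs
        }
        try:
            requests.post(SERVER_URL, json=payload, timeout=1)
        except requests.RequestException:
            pass  # drop this reading if the server is unreachable
        time.sleep(1)  # roughly one action per second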

The handler for the input mapping is relatively simple; a sketch follows the list below.

  • If Iggy rolls (or tilts far enough) right, press Start.
  • If Iggy rolls (or tilts far enough) left, press Select.
  • If Iggy spins or turns in a certain direction that is different from his last measured direction, press that directional input (Up, Down, Left, Right).
  • If Iggy hasn’t changed direction, but accelerated in his last known direction, press that directional input.
  • If Iggy hasn’t moved on his main axis, but has on his secondary or tertiary axes (by side-stepping, lying down, or standing up), press the B-button.
  • If Iggy has not moved at all, press the A-button.
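
Here’s a simplified sketch of that handler; the threshold values and helper functions are made up for illustration and are cruder than what we actually shipped:

    ROLL_THRESHOLD = 45    # degrees; illustrative value
    ACCEL_THRESHOLD = 0.3  # Gs; illustrative value

    def heading_to_direction(yaw):
        # Quantize a compass heading (0-360 degrees) to a d-pad direction.
        return ["Up", "Right", "Down", "Left"][int(((yaw + 45) % 360) // 90)]

    def forward_accel(reading):
        # Horizontal acceleration magnitude in Gs (gravity sits on the z axis).
        a = reading["acceleration"]
        return (a["x"] ** 2 + a["y"] ** 2) ** 0.5

    def off_axis_motion(reading):
        # Vertical movement beyond resting gravity: lying down, standing up, etc.
        return abs(reading["acceleration"]["z"] - 1.0) > ACCEL_THRESHOLD

    def map_to_button(reading, last_direction):
        """Map one sensor reading to one of the 8 Game Boy inputs.

        Assumes roll has been normalized to [-180, 180] and that
        `last_direction` holds Iggy's previous heading (e.g. "Up").
        """
        roll = reading["orientation"]["roll"]
        if roll > ROLL_THRESHOLD:       # hard roll/tilt right
            return "Start"
        if roll < -ROLL_THRESHOLD:      # hard roll/tilt left
            return "Select"

        direction = heading_to_direction(reading["orientation"]["yaw"])
        if direction != last_direction:               # turned to a new heading
            return direction
        if forward_accel(reading) > ACCEL_THRESHOLD:  # sped up along that heading
            return direction

        if off_axis_motion(reading):    # side-step, lie down, stand up
            return "B"
        return "A"                      # no movement at all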

It took a few iterations, but ultimately the above configuration proved fairly fruitful for game navigation. Start, Select, and the B-button are not needed as frequently in the game, so mapping them to Iggy’s less frequent actions worked well. The semantic directional mapping also made it easier for humans to influence Iggy along a route or trigger certain battle actions. Finally, throughout most of the game, pressing the A-button while walking around has no effect, making it a perfect mapping for when Iggy remained stationary, which was by far his most frequent input.

With our Pokemon Red game running on an emulator on the computer, all we had to do was send inputs to the game. We couldn’t find a good API for any of the popular emulators, so, due to time constraints, we decided to trigger virtual keystrokes on Windows with the emulator as the active window (unfortunately, we also couldn’t find a great way to do this on macOS, even with AppleScript). Running everything on a desktop also allowed us to livestream via Twitch so the rest of the office, and the world, could keep track of Iggy’s quest at all times on a big screen in the kitchen.
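
As an example of the virtual-keystroke approach, here’s how it might look with the pyautogui library, which synthesizes OS-level key events; the key bindings below are hypothetical emulator defaults, not necessarily the ones we used:

    import pyautogui

    # Hypothetical mapping from game inputs to the emulator's key bindings.
    KEY_BINDINGS = {
        "Up": "up", "Down": "down", "Left": "left", "Right": "right",
        "A": "x", "B": "z", "Start": "enter", "Select": "backspace",
    }

    def press_button(button):
        # Send a virtual keystroke to whatever window is currently active.
        pyautogui.press(KEY_BINDINGS[button])

    press_button("A")  # e.g. press the A-button in the focused emulator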

Iggy Plays Pokemon

With all of our hardware and software ready to go, we fired up the emulator, attached the client to Iggy, had lots of treats in hand, and let Iggy begin his journey.

In true hackathon style, Iggy’s prototype consisted of the device adhered to his Halloween costume: a hot dog. This setup worked out quite well, as the costume provided a padded buffer between the device and Iggy.

DogsPlayPokemon Client — Mark 1

Within an hour, and under some human influence, Iggy managed to select a Pokemon from Professor Oak’s lab: Bulbasaur. And although he lost his first battle to his rival, insisting on using only the non-attacking move Growl (we’d like to think he did this on purpose), he managed to make his way up to Route 1 and defeat a Pidgey.

This milestone was as far as Iggy got in the game before Innovation Day came to a close, but we plan on resurrecting the idea at some point in the future, perhaps at next year’s Innovation Day. We are also confident that given enough time, Iggy would in fact be able to beat the Pokemon Red video game, if not by random actions, then at least by human influence.

Links

DogsPlayPokemon GitHub [Code]

DogsPlayPokemon Twitch

More photos and videos of Iggy in action

Virtual Sense HAT
