The Web-Connected MIDI Trumpet Robot

Raspberry Pi, Teensy and Web MIDI — How I Hacked Together an Interactive Musical Demo for the Chrome Dev Summit

A few weeks ago, Ariella Eliassaf, Avi Aminov and I tried to build a robot that plays the trumpet, and failed. Nevertheless, I was invited to present the robot at the demo forum of the Chrome Dev Summit, to showcase Web MIDI and how easily you can control musical instruments with JavaScript.

Since we couldn’t get the robot’s artificial latex lips to work reliably, I went with the backup plan — using a small speaker to play the sounds, plugged right into the trumpet’s mouthpiece, so the sound travels through the trumpet and comes out of its bell.

The Chrome Dev Summit is happening tomorrow, and I finally got all the pieces of the project working together. Hooray!

So how does it work?

High Level Overview

The Robot is a MIDI device, which means it can be plugged into a computer (or a phone) and receive commands to play different notes. For instance, the following JavaScript code will cause the robot to play “Do, Re, Mi” (three notes) when run in Chrome:

Some magic midi constants and I’m fine
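The embedded gist isn’t reproduced here, but a minimal sketch of that kind of snippet could look like the following (the helper names are mine, not the original gist’s; note numbers 60, 62 and 64 are C4, D4 and E4, i.e. Do, Re, Mi):

```javascript
// Reconstruction of the kind of snippet shown at the booth (not the original gist).
// 0x90 = "note on" (channel 1), 0x80 = "note off".
const NOTE_ON = 0x90;
const NOTE_OFF = 0x80;

function noteOn(note, velocity = 127) {
  return [NOTE_ON, note, velocity];
}

function noteOff(note) {
  return [NOTE_OFF, note, 0];
}

async function playDoReMi() {
  const midi = await navigator.requestMIDIAccess();
  // Assume the robot is the first available MIDI output.
  const output = midi.outputs.values().next().value;
  for (const note of [60, 62, 64]) {
    output.send(noteOn(note));
    await new Promise((resolve) => setTimeout(resolve, 500)); // hold each note for half a second
    output.send(noteOff(note));
  }
}
```

Calling `playDoReMi()` from a page in Chrome is all it takes — no drivers, no native code.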

Each MIDI message sent to the robot consists of three values: a command, a note index, and a velocity (how loud the note should be played). The robot receives these values and interprets them.

The robot has a 3D-printed finger mechanism driven by servo motors. This mechanism allows the robot to press the trumpet’s valves in response to the MIDI commands. You can learn how I designed this mechanism in my “Designing 3D Printable Mechanisms in OpenSCAD” blog post.

The “Fingers” of the Robot

The robot also has to translate the incoming MIDI messages into sound output. It does this using synthesizer code I hacked together with the Web Audio API:

Web Audio API synthesizer running on my laptop. Next: Raspberry Pi!
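The real synthesizer shapes the trumpet tone more carefully, but the core idea can be sketched in a few lines of Web Audio code (the sawtooth wave and envelope values here are illustrative, not the actual patch):

```javascript
// Simplified sketch of a Web Audio voice; the actual trumpet synth is more elaborate.
// Convert a MIDI note number to a frequency in Hz (A4 = MIDI note 69 = 440 Hz).
function midiNoteToFrequency(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

function playNote(audioCtx, note, duration = 0.5) {
  const osc = audioCtx.createOscillator();
  const gain = audioCtx.createGain();
  osc.type = "sawtooth"; // rich in harmonics, vaguely brass-like
  osc.frequency.value = midiNoteToFrequency(note);
  // Simple decay envelope so the note doesn't end with a click.
  gain.gain.setValueAtTime(0.3, audioCtx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, audioCtx.currentTime + duration);
  osc.connect(gain);
  gain.connect(audioCtx.destination);
  osc.start();
  osc.stop(audioCtx.currentTime + duration);
}
```

In a page, `playNote(new AudioContext(), 60)` would play a middle C through the default audio output.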

The Electronics

High-level diagram of the robot electronics. Laptop credit.

The Robot connects to a computer (or a smartphone) through USB, and identifies itself as a MIDI device. A small Teensy 3.2 board takes care of speaking USB and decoding the incoming MIDI messages. It runs a bunch of Arduino code that controls the servos for the finger mechanism, and also forwards the MIDI messages to a Raspberry Pi over UART.

The purpose of the Raspberry Pi is to synthesize the audio for the robot. It runs a small Node.js server that opens a Web Socket and forwards all the incoming MIDI messages from the Pi’s UART interface to the Web Socket. It uses the serialport module for the UART part, and socket.io for Web Sockets.

The Raspberry Pi also runs headless Chrome, which loads a web page. This web page listens for MIDI messages coming from the Web Socket, and then uses the Web Audio trumpet synthesizer that I mentioned earlier to create the sound for the trumpet.
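On the page side, the glue between the socket and the synthesizer is essentially a small MIDI decoder. Something along these lines (the event name and handler names are my own, not necessarily what the real page uses):

```javascript
// Decode a 3-byte MIDI message into a note-on/note-off action.
// By MIDI convention, a "note on" with velocity 0 means "note off".
function decodeMidiMessage([status, note, velocity]) {
  const command = status & 0xf0; // upper nibble = command, lower nibble = channel
  if (command === 0x90 && velocity > 0) return { type: "noteOn", note, velocity };
  if (command === 0x80 || command === 0x90) return { type: "noteOff", note };
  return { type: "ignored" };
}

// In the page this would be wired to the socket.io client, roughly:
//   socket.on("midi", (bytes) => {
//     const action = decodeMidiMessage(bytes);
//     if (action.type === "noteOn") synth.noteOn(action.note, action.velocity);
//     if (action.type === "noteOff") synth.noteOff(action.note);
//   });
```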

The sound comes out of the Pi’s audio jack, is amplified by an external TDA2030 amplifier (I found mine as a DIY kit in a local electronics shop; you can find one too by googling “TDA2030 kit”), and is fed to the speaker.

DIY Amplifier

So in short: the computer sends a MIDI message through USB. The Teensy receives it, updates the servo motors, and sends a copy of the message to the Raspberry Pi through UART. The Node.js process running on the Raspberry Pi receives this message and sends it through a Web Socket to the web page running in headless Chrome, which in turn uses the Web Audio API to synthesize the sound. Pretty complex setup, isn’t it?

All the electronics — Raspberry Pi 3, Teensy 3.2, DIY amplifier and a T-Rex sticker

Ground Loop Noises

When I put everything together for the first time, the amplifier would not only amplify the trumpet sound — it would also amplify some annoying noise as soon as I plugged in the Raspberry Pi. The noise seemed to correlate with CPU and WiFi activity.

After much googling (and frustration), I found out that this was caused by a ground loop, the result of powering both the Raspberry Pi and the amplifier from the same supply. Google also revealed that I would need an isolation transformer to fix it. You can order one from eBay for about $1.50, but finding one in Israel a day before leaving for the conference was a tough challenge. Luckily, a friend in the local maker community had one and saved the day!

This little guy saved the day!

The Editor

The idea behind presenting this project at the Chrome Dev Summit is to spread the word about Web MIDI and help developers learn about the API. Thus, I created a small editor interface that lets the attendees write JavaScript code snippets to control the robot:

I wrote this editor app on the plane on my way to the Bay Area. It is an Angular application. It uses the Monaco Editor, the engine behind Visual Studio Code, for the editing part, and Firestore for storing the code snippets created by the users.

Monaco Editor turned out to be amazing! It takes just a few lines of code to get syntax highlighting, code auto-completion, TypeScript error detection, code formatting and code transpilation — basically, everything you’d expect from a modern, powerful source code editor.

You can find the complete source code for this app on GitHub. I also made available all the code that powers the robot, as well as the hardware design files.

Today is The Big Day! 🎺

I’m just about to head to the Chrome Dev Summit. I can’t wait to see how people will interact with the Robot and what kind of creative code snippets they will come up with. Fingers crossed for everything to work well! 🤞

Everything ready for prime time! (in my hotel room)