Give Web Developers a Code Editor and a Trumpet, What Happens?

Not What I Expected: My Web MIDI Demo at This Year’s Chrome Dev Summit

It all began when I failed to build a trumpet-playing robot at Geekcon. Shortly after, I was invited to present a demo at the Chrome Dev Summit, and I thought this would be a great opportunity to explore a new API I had always wanted to play with (literally): Web MIDI. The result was not quite what I was expecting, as you will shortly discover.

MIDI Keyboard and a light bulb?! Read on to find out

Last year, I presented the In-Real-Life Chrome T-Rex Game in the same event. It was a fun experience, and I learnt a whole bunch of new skills. It was, however, a huge undertaking for me — I spent a month of my life trying to make that happen, and fixed issues with the hardware and bugs in the software up to the very last moment, in the hotel room the night before the conference.

I didn’t want to repeat that experience this year, so I tried to go with a less demanding project. After spending some more time tinkering with the trumpet, and even creating a JavaScript- and Angular-powered device to measure air pressure, I realized I was heading in exactly the same direction as last year: lots of work, high risk, a very demanding project. So I sent the conference organizers an email:

This Is Not Going To Work!

After briefly explaining the situation, I presented them with two options:

An hour later, Robert Nyman replied:

So my goal was set to provide the “Best Experience”. Challenge Accepted!

I spent the next few weeks building the electronics for the trumpet, designing a 3D-printed finger mechanism for it, and even coming up with a Web Audio-based sound engine. The project still consumed much of my time, but it also provided me with a bunch of post topics for my daily blogging challenge.

Fast forward to the days before the conference. It was just after AngularConnect, and I was on a plane to the Bay Area. By then, I had all the electronics for the project ready and packed in my luggage, but there was one missing piece —

There Was No User Interface

I spent so much time getting the hardware to work that I totally neglected the user interface part. I only had a simple web page that provided a piano-like interface:

Ever wondered how to create a piano keyboard in HTML? Check out the source code

This was obviously not the “Best Experience” for the Chrome Dev Summit attendees. Since the plane had WiFi, I decided to spend the 10-hour flight working on it. I started by asking myself: what would be the best user interface for the summit attendees — developers?

The Best User Interface for Developers: A Code Editor

The goal of the project was to educate developers about the Web MIDI API, so it made a lot of sense to show them the JavaScript code that powered the demo. But then I thought: why stop at showing the code?

Code editors are where most developers feel at home. I decided this would be the user interface for my demo, so that attendees would not only see the code behind it, but would also be able to tinker with it and save their modified versions. Then, by the end of the conference, I would have a gallery of user-contributed code snippets that play different tunes. Or so I thought…

I spent the first few hours of the flight researching different solutions. After comparing several alternatives (such as CodeMirror, which I also used for tsquery-playground), I decided to go with the Monaco Editor, the editor that powers Visual Studio Code. It provides code auto-completion and type checking (thanks to tight integration with TypeScript), which is essential when exploring a new API.

Exploring new APIs is a breeze with auto complete

You can try it yourself using the simulator version. This is a special version that simulates the Web MIDI API, so I could test the code without a real MIDI instrument connected to my computer (obviously, playing a real trumpet on a plane wouldn’t be practical. Or would it? 😉).
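A simulator like this boils down to stubbing the one entry point the Web MIDI spec defines, `navigator.requestMIDIAccess()`, with an object that has the same shape as the real thing. Here is a minimal sketch of the idea (the names and structure are my own, not the actual simulator’s code):

```javascript
// A fake MIDIOutput that records the messages it receives
// instead of driving real hardware.
class FakeMIDIOutput {
  constructor() {
    this.sent = [];
  }
  // Same signature as the real MIDIOutput.send(): an array of MIDI data bytes.
  send(data) {
    this.sent.push([...data]);
  }
}

// A fake MIDIAccess exposing a single simulated output, keyed in a Map
// just like the real `outputs` property.
function createFakeMIDIAccess() {
  const output = new FakeMIDIOutput();
  return {
    inputs: new Map(),
    outputs: new Map([['simulated-trumpet', output]]),
  };
}

// Drop-in replacement for navigator.requestMIDIAccess(), which likewise
// returns a Promise resolving to a MIDIAccess-like object.
function requestSimulatedMIDIAccess() {
  return Promise.resolve(createFakeMIDIAccess());
}
```

Code written against a stub like this can later run unchanged against the real API, since both resolve to an object exposing `inputs` and `outputs` maps.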

A couple of hours later, the project was more or less ready, and I landed in San Francisco. It was probably the most productive flight I have ever had, and surprisingly, I managed to squeeze nearly 1GB of data out of the plane’s WiFi (400MB of which was consumed by npm alone).

You can find the complete source code for this project on GitHub, including some tricks I employed to visualize the code execution (more on that in a future post).

When I got to the hotel, I quickly unpacked and started assembling the project in my hotel room:

Setting up in the hotel room: 2 artificial fingers ready, one more to go!

A quick dry run revealed that despite the long journey, everything was still functional and worked well with the editor I had hacked together during the flight. Ready for prime time!

The Conference Day

Finally, the big day arrived!

I was really curious to see whether the attendees would take the opportunity to write some code for my demo, so I arrived early at the conference venue to set everything up:

Debugging is always more effective with your eyes closed

Despite everything having worked perfectly in the hotel, there were some issues getting the Raspberry Pi connected to the conference’s WiFi. Thanks to the very helpful event organizers, an Ethernet cable was provided, and an hour later everything was ready. Then came the first surprise:

A MIDI Keyboard

My friend Lars Knudsen also attended the conference, and since he knew about my project, he decided to surprise me and brought a MIDI keyboard 🎹. Then came the brilliant idea of writing a JavaScript code snippet that would connect the keyboard to the trumpet: you play the keyboard, and the sound comes out of the trumpet:

Lines 9–10 are where the magic happens

and the result:

Ken Franquiero playing Doom E1M1 on the trumpet using the MIDI keyboard 🎶
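The essence of such a bridge is just forwarding every incoming MIDI message straight to the output. A sketch of the pattern, assuming `input` is the keyboard’s MIDIInput and `output` is the MIDIOutput driving the trumpet (the function name is mine):

```javascript
// Forward every MIDI message from an input device (the keyboard)
// straight to an output device (the trumpet). Both objects follow the
// Web MIDI shapes: inputs fire 'midimessage' events carrying a `data`
// byte array, and outputs expose send().
function bridge(input, output) {
  input.onmidimessage = (event) => {
    // event.data is [status, note, velocity] for note on/off messages.
    output.send(event.data);
  };
}
```

With the Web MIDI API, `input` and `output` would come from the `inputs` and `outputs` maps of the resolved `MIDIAccess` object.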

MIDI All Things!

But it didn’t stop there. He also “happened” to have a MIDI saxophone (I didn’t even know such a thing existed), so he connected it too, and now he could play the saxophone and get the sound out of the trumpet:

Just when I thought I had seen it all, Ruth John suddenly appeared with a mysterious MIDI device and plugged it into the demo:

Ruth trying to figure out how to connect her MIDI device to the demo

A few minutes later, her device was controlling the trumpet, too:

That was definitely an unexpected plot twist! But what happened to my original hypothesis? Did people actually write code to play songs?

Developers with Code Editor and Trumpet 👩‍💻🎺

When I wrote the “Web MIDI Playground”, the code editor for my demo, I included a sample code snippet that plays a few notes, so others could use it as a starting point. Lars took the opportunity to code “Hava Nagila”, which you can listen to in the online simulator.

Next, Shmuela Jacobs coded a song her baby loves listening to:

Pair programming is always more fun! 🎶

She hit “Run!” and started moving to the tune:

This was the first program she wrote especially for her baby! Check out the code

She also told me that it was her first time writing code with async / await, and she really liked how straightforward it was. The other attendees played her code snippet more than 50 times throughout the event!
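The async / await style fits this kind of snippet naturally: send a note-on, await a delay, send a note-off, repeat. A sketch of the pattern, with a hypothetical `noteToMidi` helper for readable note names (these helper names are mine, not the playground’s actual API):

```javascript
// Semitone offset of each pitch class within an octave.
const OFFSETS = { C: 0, 'C#': 1, D: 2, 'D#': 3, E: 4, F: 5,
                  'F#': 6, G: 7, 'G#': 8, A: 9, 'A#': 10, B: 11 };

// Convert a note name like 'C4' or 'F#5' to its MIDI note number
// (C4 = 60 in the common convention).
function noteToMidi(name) {
  const pitch = name.slice(0, -1);       // e.g. 'C#'
  const octave = Number(name.slice(-1)); // e.g. 4
  return (octave + 1) * 12 + OFFSETS[pitch];
}

const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Play a list of { name, ms } notes sequentially on a MIDIOutput-like
// object: note on (0x90), wait, note off (0x80).
async function playTune(output, notes) {
  for (const { name, ms } of notes) {
    const midiNote = noteToMidi(name);
    output.send([0x90, midiNote, 127]);
    await delay(ms);
    output.send([0x80, midiNote, 0]);
  }
}
```

Each `await delay(...)` suspends the loop without blocking the page, which is exactly what makes this style so approachable for a first-time async / await user.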

So yeah, people were actually writing code to play some music during the conference. What I didn’t anticipate was…

Developers Love Abstractions

Markus is a Swiss developer I met on the Espruino forums after sharing the experience of bringing a fried Espruino board back from the dead. After he learned about my project, he volunteered to help man the booth during the event.

He arrived shortly after we connected the MIDI keyboard to the project:

Somehow an “Edge” sticker made it to the electronics box of the project

As you would expect from a seasoned developer, he immediately recognized the opportunity to create an abstraction layer, and spent the next two hours working on this piece of code: an interpreter for a text-based music notation format he came up with:

Markus’s musical notation format: note name and octave, followed by duration and lyrics

Markus also included the lyrics (as you can see above), and the interpreter displays them on screen as the song plays, turning the project into a trumpet karaoke machine:

The next surprise was the brainchild of Reilly Grant, a software engineer on the Chrome team. We had met at the summit the year before, and this year he also volunteered to help man the demo booth. After watching many attendees play melodies on the MIDI keyboard, he came up with a brilliant idea:

Abstracting Away The Code Editor

He spent about half an hour writing a “Composer” snippet that recorded whatever you played on the keyboard and spat out code that plays the same thing. You could then copy this code and save it in your own snippet.
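The trick behind such a recorder is to capture each note with a timestamp as it arrives, then pretty-print the list back as the same kind of await-based snippet the playground already runs. A sketch of the idea (the names, and the `playNote` / `delay` helpers the generated code assumes, are mine):

```javascript
// Record incoming note-on messages with timestamps, then emit
// playground-style code that replays the same performance.
function createComposer(now = Date.now) {
  const events = [];
  return {
    // Feed this with event.data from a 'midimessage' handler.
    record(data) {
      const [status, note, velocity] = data;
      // 0x9n with velocity > 0 is a note-on message.
      if ((status & 0xf0) === 0x90 && velocity > 0) {
        events.push({ note, time: now() });
      }
    },
    // Turn the recording into runnable code, preserving the gaps
    // between notes as awaited delays.
    toCode() {
      const lines = [];
      events.forEach((e, i) => {
        if (i > 0) {
          lines.push(`await delay(${e.time - events[i - 1].time});`);
        }
        lines.push(`playNote(${e.note});`);
      });
      return lines.join('\n');
    },
  };
}
```

Injecting a `now` function makes the recorder easy to test without real timing, while defaulting to `Date.now` in the browser.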

My original intention was to have the attendees write code to play the trumpet, but thanks to Lars’s keyboard and Reilly’s piece of code, they now had an interface where they could create the code without actually writing it. It was all abstracted away!

This resulted in a bunch of new code snippets that were recorded by the attendees, such as the following:

But the fun didn’t end there. Reilly, who remembered my magical battery-powered smart light bulb from the previous year, asked me if I had it with me. Luckily, I did, so we proceeded to combine the Web MIDI code with Web Bluetooth:

Let There Be… Light!

We wrote a short piece of code that uses Web Bluetooth to connect to the bulb and light it up whenever a note was played. About 15 minutes later, he had it working:

Web MIDI and Web Bluetooth working together!

So now you could also put on a disco light show while playing the trumpet with the keyboard!
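The Web Bluetooth side follows the usual pattern: request a device, connect to its GATT server, grab a characteristic, and write to it from the MIDI message handler. A sketch of that shape, with placeholder UUIDs and payload bytes since the real bulb’s protocol isn’t shown here:

```javascript
// Placeholder UUIDs: the real bulb advertises its own service and
// characteristic, which are not documented in this post.
const BULB_SERVICE = 0xffe5;
const BULB_CHARACTERISTIC = 0xffe9;

// Connect to the bulb over Web Bluetooth (a browser-only API; this
// must be triggered by a user gesture such as a click).
async function connectToBulb() {
  const device = await navigator.bluetooth.requestDevice({
    filters: [{ services: [BULB_SERVICE] }],
  });
  const server = await device.gatt.connect();
  const service = await server.getPrimaryService(BULB_SERVICE);
  return service.getCharacteristic(BULB_CHARACTERISTIC);
}

// Given a write function, return a MIDI message handler that flashes
// the bulb: on for note on, off for note off. The payload bytes are
// illustrative, not the bulb's actual command protocol.
function makeLightHandler(write) {
  return (event) => {
    const [status] = event.data;
    if ((status & 0xf0) === 0x90) {
      write(Uint8Array.of(0x01)); // "on" command (placeholder)
    } else if ((status & 0xf0) === 0x80) {
      write(Uint8Array.of(0x00)); // "off" command (placeholder)
    }
  };
}
```

In the browser, `write` would be something like `(bytes) => characteristic.writeValue(bytes)`, and the handler would be assigned to the MIDI input’s `onmidimessage`, alongside the code that drives the trumpet.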

Developers Are Full of Surprises

When I started working on this project, I didn’t really know what to expect. Projects presented at conferences are usually watch-only, or offer limited ways to interact with them (like the T-Rex game last year). This time, I gave the attendees the full power of a code editor, and I wasn’t sure where they would take it.

I was really blown away by all the different ways the attendees programmed my robot, connected it to all sorts of other MIDI devices, and created interesting abstractions on top of the code editor. Thank you!