Melody Mixer: Using TensorFlow.js to Mix Melodies in the Browser

Torin Blankensmith
Mar 15, 2018 · 7 min read


Thanks to tools like TensorFlow.js, it’s becoming easier for any coder to tinker with machine learning, even without being a machine learning expert. In this article, we’ll show you an example of what TensorFlow.js makes possible through a tool called MusicVAE.js. We’ve combined MusicVAE.js with the easy-to-use web tools P5.js and Tone.js to explore musical melodies in a new experiment called Melody Mixer.

This experiment came from a simple idea: what would it sound like to blend between two different musical melodies? What if we asked a computer to start with one melody and end with the other? What melodies might we discover along the way?

Melody Mixer uses MusicVAE.js, a web framework released by the Magenta research team at Google. It lets us combine and transform two different melodies by blending them at any percentage. MusicVAE.js does this by running a deep neural network locally in your browser, using TensorFlow.js.

Open this link to try out Melody Mixer. First, listen to the two separate melodies. Next, click and drag the melodies apart, and listen to what you get. The computer has morphed from one melody to the other — using musical knowledge it has learned from analyzing 28 million different melodies — all in real time, locally in your browser. (If you want to dive deeper into the details of how the machine learning works, check out this blog post.)

OK, now let’s walk through how this project was made. We’ve created some demos that make it easy for you to get started building your own projects. First, we’ll learn how to set up MusicVAE.js, then visualize the melodies using P5.js, and finally play back the melodies using Tone.js.

You don’t need to have any experience with machine learning to do this, but you should have basic knowledge of web development.

Tools we’ll be using

P5.js: A library that helps make coding accessible for artists, designers, educators, and beginners. We’ll be using P5 to create interactive graphics.

Tone.js: A framework for creating interactive music in the browser. We’ll be using this to play back the melodies in the browser.

TensorFlow.js: A hardware-accelerated machine intelligence library for the web. Essentially, this means we can write JavaScript commands that will run directly on the GPU. We won’t be working directly with TensorFlow.js, but it’s the underlying tool that allows MusicVAE.js to run a machine learning model in real time in the browser.

MusicVAE.js: Magenta’s new library that allows us to blend between melodies and also sample / generate new melodies. The library also works with drum beats, but we’ll be working with melodies in this demo.

Demo 1: How to use MusicVAE.js

In this demo, we’re going to walk through how to initialize MusicVAE.js and interpolate between two melodies. Step one is to send MusicVAE.js the notes for each melody. We’ll start off by exploring the note sequence format that MusicVAE.js takes and highlight a couple of helper functions that make the process easier.

Note sequences

Let’s start with a little melody. Here’s how it looks as sheet music:

You can hear it here:

And here’s the same melody visualized as a piano roll:

Now, here’s how this melody is passed into MusicVAE.js. For each note in the melody, we specify the pitch (notice how they correspond to the piano roll above) as well as the timing, indicated by start and end slots.
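Here’s a sketch of what that looks like in code. The field names (pitch, quantizedStartStep, quantizedEndStep) follow MusicVAE.js’s note sequence format, though they may differ slightly between library versions, and the exact pitches below are illustrative rather than a note-for-note transcription of the sheet music above:

```js
// One melody, expressed as a MusicVAE.js note sequence.
// Each note has a MIDI pitch plus the slot ("quantized step") where it
// starts and the slot where it ends, out of 32 slots total.
const MELODY1 = {
  notes: [
    { pitch: 69, quantizedStartStep: 0,  quantizedEndStep: 2  },
    { pitch: 71, quantizedStartStep: 2,  quantizedEndStep: 4  },
    { pitch: 73, quantizedStartStep: 4,  quantizedEndStep: 6  },
    { pitch: 74, quantizedStartStep: 6,  quantizedEndStep: 8  },
    { pitch: 76, quantizedStartStep: 8,  quantizedEndStep: 10 },
    { pitch: 81, quantizedStartStep: 12, quantizedEndStep: 16 }, // starts after a 2-slot rest
    { pitch: 78, quantizedStartStep: 16, quantizedEndStep: 20 },
    { pitch: 81, quantizedStartStep: 20, quantizedEndStep: 24 },
    { pitch: 76, quantizedStartStep: 24, quantizedEndStep: 32 }  // held for the last 8 slots
  ]
};
```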

MusicVAE.js expects the melody to span 32 slots, but notice that some slots can remain empty, creating a silent moment in the melody. Also notice that notes held longer use up more than one slot. In the example above, we’re only passing in nine notes, but because some are held for several slots, the sequence still spans all 32 slots.

Pitch-wise, MusicVAE.js allows us to specify 88 notes — just like on a piano. The lowest possible pitch is 21 and the highest is 108. The melody interpolation in MusicVAE.js is monophonic, meaning only one note can play at a time. Length-wise, the shortest a note can be is a single one of the 32 slots, and a note can have any duration as long as it doesn’t overlap with another note or extend beyond slot 32.

Now that you’ve seen the structure of the note sequence, let’s feed the melody to MusicVAE.js and get back a morphed melody.

Initialization
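Here’s a minimal initialization sketch. The checkpoint URL and the music_vae global are assumptions: substitute whichever melody checkpoint you’re using, and whatever name the library is exposed under in your setup. The MusicVAE constructor takes a checkpoint URL, and initialize() returns a promise that resolves once the model has loaded.

```js
// Example checkpoint URL for a 2-bar melody model; swap in the
// melody checkpoint you want to use.
const CHECKPOINT_URL =
  'https://storage.googleapis.com/download.magenta.tensorflow.org/tfjs_checkpoints/music_vae/mel_2bar_small';

// The music_vae global depends on how the library is included.
const musicVAE = new music_vae.MusicVAE(CHECKPOINT_URL);

// initialize() downloads the model weights; everything after this
// runs locally in the browser via TensorFlow.js.
musicVAE.initialize().then(() => {
  console.log('MusicVAE is ready!');
});
```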

With musicVAE initialized, we can now use the interpolate function. We do this by passing in an array containing the two melodies that we want to blend, then specifying the number of interpolations we want back.

Let’s start by creating two melodies, just like the format above:
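As a sketch, here’s a second melody in the same format (again with illustrative pitches), followed by the interpolate() call itself. The result is stored in interpolatedNoteSequences, which we’ll draw in Demo 2.

```js
// A second melody in the same 32-slot format (pitches are illustrative).
const MELODY2 = {
  notes: [
    { pitch: 50, quantizedStartStep: 0,  quantizedEndStep: 4  },
    { pitch: 55, quantizedStartStep: 4,  quantizedEndStep: 6  },
    { pitch: 62, quantizedStartStep: 6,  quantizedEndStep: 8  },
    { pitch: 64, quantizedStartStep: 8,  quantizedEndStep: 16 },
    { pitch: 62, quantizedStartStep: 16, quantizedEndStep: 20 },
    { pitch: 59, quantizedStartStep: 20, quantizedEndStep: 24 },
    { pitch: 55, quantizedStartStep: 24, quantizedEndStep: 32 }
  ]
};

// How many sequences we want back, including the two originals.
const numInterpolations = 3;

let interpolatedNoteSequences = [];

// interpolate() takes an array with the two melodies plus the number of
// sequences to return, and resolves with the blended note sequences.
musicVAE.interpolate([MELODY1, MELODY2], numInterpolations)
  .then((sequences) => {
    interpolatedNoteSequences = sequences;
  });
```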

The smallest number we can use for numInterpolations is 2. With this, we would get back our original two melodies. If we set numInterpolations to 3, we get back MELODY1, then a new melody blended halfway between MELODY1 and MELODY2, followed by MELODY2. As we increase numInterpolations, we get back more sequences in between MELODY1 and MELODY2, which creates a smoother transition.

Next, try changing numInterpolations and, using the steps in Demo 2, see how the notes change.

Demo 2: Displaying the results

At this point, the interpolatedNoteSequences variable holds three melodies, in this order:

  1. MELODY1 (just as we input it)
  2. The new blended melody
  3. MELODY2 (just as we originally input it)

Let’s use p5.js to plot out all three melodies on a piano roll. (In the following step we’ll create a playhead to play the music!)
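Here’s one way that drawing might look as a p5.js sketch. The layout constants and colors are arbitrary choices; the important part is that each sequence gets its own horizontal block, each note becomes a rectangle whose x position and width come from its start and end slots, and its y position comes from its pitch.

```js
const NUM_STEPS = 32;      // slots per sequence
const SEQ_WIDTH = 200;     // pixels given to each sequence on screen
const MIN_PITCH = 21;      // lowest piano pitch we might draw
const MAX_PITCH = 108;     // highest piano pitch

function setup() {
  createCanvas(SEQ_WIDTH * numInterpolations, 320);
}

function draw() {
  background(245);
  drawNoteRects();
}

// Draw every note of every sequence as a small rectangle:
// x comes from the sequence index and start slot, width from how many
// slots the note is held, and y from the pitch.
function drawNoteRects() {
  noStroke();
  interpolatedNoteSequences.forEach((sequence, seqIndex) => {
    const offsetX = seqIndex * SEQ_WIDTH;

    // Originals in one color, blended melodies in another.
    if (seqIndex === 0 || seqIndex === interpolatedNoteSequences.length - 1) {
      fill(60, 120, 216);
    } else {
      fill(230, 120, 80);
    }

    sequence.notes.forEach((note) => {
      const x = offsetX + map(note.quantizedStartStep, 0, NUM_STEPS, 0, SEQ_WIDTH);
      const w = map(note.quantizedEndStep - note.quantizedStartStep, 0, NUM_STEPS, 0, SEQ_WIDTH);
      const y = map(note.pitch, MIN_PITCH, MAX_PITCH, height, 0);
      rect(x, y, w, 4);
    });
  });
}
```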

Notice that you can ask MusicVAE for more blended blocks (each one with 32 slots), which will create an even more intricate blend between the two original melodies. We do this by changing the numInterpolations variable.

Demo 3: Playing the Blended Melodies with Tone.js

Now that you’ve seen how to set up and interpolate melodies using MusicVAE, let’s play the notes back in the browser using Tone.js. We have recorded piano samples covering the 88 piano keys (MIDI notes 21–108), so we can easily build our own piano in the browser using the sampler from Tone.js. (Alternatively, we could create a synthesizer to play back our melodies.)
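Here’s a sketch of that setup using the current Tone.js Sampler API (older Tone.js versions pass the sample map and base URL as separate constructor arguments and use toMaster() instead of toDestination()). The sample file names and path are placeholders for your own recordings, and playNote is the small helper referenced below, written here as a plain function.

```js
// Map a few recorded piano samples to note names; Tone.Sampler will
// repitch the nearest sample for any note in between.
// (File names and paths here are placeholders for your own recordings.)
const sampler = new Tone.Sampler({
  urls: {
    C3: 'C3.mp3',
    C4: 'C4.mp3',
    C5: 'C5.mp3'
  },
  baseUrl: './sounds/'
}).toDestination();

// Helper used during playback: convert a MIDI pitch (e.g. 69) into a
// note name (e.g. "A4") and play it for a sixteenth note.
function playNote(pitch) {
  const noteName = Tone.Frequency(pitch, 'midi').toNote();
  sampler.triggerAttackRelease(noteName, '16n');
}
```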

Next, we will update our draw function to include playback of the current note. Every time draw is called, we will use Tone to calculate where we are in the playback, as a percentage. We will then map that percentage to the current sequence and the current step within that sequence. Lastly, we will look up the notes at that step in the data and call playNote to send them to Tone.js.
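Here’s a sketch of that updated draw(), replacing the one from Demo 2 and reusing its drawNoteRects() helper. TOTAL_PLAY_TIME is an arbitrary choice for how long the full set of sequences should take to loop, and the mousePressed() handler is there because browsers only allow audio to start after a user gesture.

```js
const TOTAL_PLAY_TIME = 12;   // seconds to loop through all sequences (arbitrary)
let lastStepPlayed = -1;

// This draw() replaces the one from Demo 2: it still draws the notes,
// then works out the current playback position and triggers notes.
function draw() {
  background(245);
  drawNoteRects();

  // Where are we in the playback, as a value from 0 to 1?
  const progress = (Tone.Transport.seconds % TOTAL_PLAY_TIME) / TOTAL_PLAY_TIME;

  // Map that percentage to a global slot index across every sequence...
  const totalSteps = interpolatedNoteSequences.length * NUM_STEPS;
  const globalStep = Math.floor(progress * totalSteps);

  // ...then split it into the current sequence and the slot within it.
  const seqIndex = Math.floor(globalStep / NUM_STEPS);
  const step = globalStep % NUM_STEPS;

  // Draw a simple playhead at the current position.
  stroke(0);
  line(progress * width, 0, progress * width, height);

  // Each time we enter a new slot, play any note that starts on it.
  if (globalStep !== lastStepPlayed && interpolatedNoteSequences[seqIndex]) {
    lastStepPlayed = globalStep;
    interpolatedNoteSequences[seqIndex].notes
      .filter((note) => note.quantizedStartStep === step)
      .forEach((note) => playNote(note.pitch));
  }
}

// Browsers require a user gesture before audio can start.
function mousePressed() {
  Tone.start();            // unlock the audio context
  Tone.Transport.start();  // start the clock that drives playback
}
```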

Demo 4: Adding a bit of interaction sparkle

Demo: g.co/melodymixer

Repo: https://github.com/googlecreativelab/melody-mixer

Next Steps

We highly recommend tinkering with this example code a bit. Break it and make it do weird things!

Playing with the code will likely make you more curious about what’s happening behind the scenes with MusicVAE. To learn more, read this blog post by Magenta, the research team behind this great library.

We also recommend taking time to further explore Tone.js and p5.js, as they each have the potential to do more with MusicVAE — both musically and visually. Here are a couple of places to learn more about them:

P5.js Resources: start with the official p5.js site, https://p5js.org/, which has tutorials, examples, and full reference documentation.

Tone.js Resources: start with the official Tone.js site, https://tonejs.github.io/, which has API documentation and interactive examples.

Conclusion and Takeaways

It’s exciting to see how much easier it’s become for any coder to explore machine learning thanks to tools like TensorFlow.js. We’re looking forward to seeing what others do next. If you have something to share, consider submitting it to the Experiments page so we can all learn from each other and spark new ideas for what can be made with these tools.

Acknowledgments

Special thanks to Kyle Phillips for his help making this code approachable. Thanks to Amit Pitaru and Alex Chen for their creative direction, and to Adrienne Le for help on this blog post. Thanks to Adam Roberts for his work on MusicVAE and for training all the melody models that make this project possible. Thanks to Nikhil Thorat and Daniel Smilkov for their help getting this project running and optimized with TensorFlow.js. Thanks to Jordan Griffith for helping make this project come together. Thanks to Glen Cochon for help with design polish.

If you enjoyed this write-up, you can find more of my work at www.torinblankensmith.com.
