Making Blinding Lights from scratch using Python!
Well, kinda.
I used AJAX Sound Studio's Pyo library to create a version of The Weeknd's Blinding Lights in Python. All the sounds were designed and arranged in code. Here's what I've managed so far:
My motivation for this project was to explore audio-related libraries and to dabble in sound design. If you are not familiar with the basics of sound design and synthesis, I recommend checking out this fun website from Ableton.
I chose Blinding Lights because the song heavily features classic synthesizer sounds. I had also previously attempted to make it in a DAW with Serum, using Serum presets for the synths and samples for the drums.
With Pyo, though, all sounds, including the drums, were designed from scratch. To keep it simple and time-bound, I decided to work on only these four layers: chords, bass, drums (kick and snare only) and lead pluck. Pyo's delightful documentation helped me get things working quickly.
Here’s what it provides APIs for:
- Pyo provides oscillator classes to design your sounds with (just as strings do on a guitar, oscillators in a computer produce the base signal that gets shaped into the eventual sound); see the bare-bones sketch after this list.
- It provides an intuitive instrument abstraction that encapsulates the elements of a sound, including oscillators, envelopes, filters and effects (elements used to further shape the base signal into the desired sound).
- Finally, it provides APIs to manage timing: each element of the track can be arranged in time using its Events framework.
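Before getting into the riff, here's the bare minimum needed to get a sound out of Pyo: boot the audio server and send an oscillator to the output. The frequency, wave type and volume below are arbitrary placeholders, not values from the track.

```python
from pyo import *

s = Server().boot()   # boot the audio server
s.start()

# LFO doubles as a simple waveform generator; type=3 selects a triangle wave.
osc = LFO(freq=220, type=3, mul=0.3).out()

s.gui(locals())       # small GUI window that keeps the script running
```

Run this and you should hear a steady triangle-wave tone until you close the server window.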
Below, I'll show how to use these APIs to design and arrange the catchy lead riff from the song. First, I built the instrument.
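A rough sketch of its shape, built on Pyo's EventInstrument base class, looks something like this; the wave type, filter frequency and resonance here are illustrative stand-ins rather than the exact values I used:

```python
from pyo import *

class LeadPluck(EventInstrument):
    """Pluck-style instrument: oscillator -> bandpass filter -> output."""

    def __init__(self, **args):
        EventInstrument.__init__(self, **args)
        # self.freq and self.env are supplied by the Events framework:
        # the pitch comes from the note list, the envelope from the
        # attack/decay settings passed to Events. type=2 selects a square wave.
        self.osc = LFO(freq=self.freq, type=2, mul=self.env)
        # A bandpass filter rounds the raw square wave into a pluck-ish tone.
        self.filt = ButBP(self.osc, freq=1500, q=2).out()
```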
The oscillator I'm using here as the signal generator is an LFO (low-frequency oscillator). The type argument lets me choose the wave shape: square, triangle or saw. The signal then runs through a bandpass filter to further develop the sound. How these elements work together to determine the final sound is beyond the scope of this post, but the Ableton site mentioned earlier will give you a fair idea. If that's not your style, you can simply tweak the different parameters and hear how the sound changes.
I used the Events framework to arrange the sounds in time. I first described an array of notes using MIDI numbers, along with their durations and decay times. Then I passed these sequences to an Events object along with my instrument, the volume level and envelope parameters like attack and decay.
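Here's a sketch of that arrangement, with placeholder notes and settings rather than the actual riff:

```python
# Placeholder pitches, durations and envelope values, not the actual riff.
riff = Events(
    instr=LeadPluck,                                 # instrument class from above
    midinote=EventSeq([65, 65, 62, 65, 69, 72]),     # pitches as MIDI numbers
    beat=EventSeq([1/2., 1/4., 1/4., 1/2., 1/4., 1/4.]),  # durations in beats
    db=-9,                                           # volume in decibels
    attack=0.01, decay=0.1,                          # envelope parameters
    sustain=0.5, release=0.05,
)
```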
Finally, I got the instrument to start playing the notes using the play() function and used the delay parameter to determine when in the track these events start playing.
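Continuing the sketch above (and assuming the Server instance s booted earlier), that step looks something like this; the delay value here is arbitrary:

```python
# Start these events eight seconds into the track.
riff.play(delay=8)

s.start()          # assumes the Server instance `s` was booted earlier
s.gui(locals())    # keeps the script alive while the track plays
```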
The entire code, including the five instruments and their events, can be found here. If you are so inclined, I recommend downloading the code and running it; it shouldn't be any trouble.