Coding through Games and Beat-Making
Music, technology and science hold parallels that are currently being explored in education, research, hardware and software, as well as the arts, from New Media Art to Contemporary Art.
In education, new movements created by educators, such as Deeper Learning and Blended Learning, are designed to engage students through technology and challenge them to “learn how to live and learn”.
No wonder: Google has changed the way we research, study and learn. We use technology in every aspect of our daily lives, and to dig deeper is to learn how to learn.
- Deeper Learning is aimed at students “drinking from a deep well of knowledge rather than sipping from multiple wells”.
- Blended Learning uses a combination of online digital media/resources with traditional classroom methods.
Through cross-curricular education we can use small streams, like art and music, as a compass directing us through a vast ocean of knowledge, with technology as the vehicle that keeps us afloat. A common thread runs between these new forms of education, technology and music, one that can lead learners toward more relevant, engaging applications of science and coding.
I went to a specialized private music college, Berklee College of Music, to find my passion and specialty. I would like to pass this on to anyone interested in the intersection of coding and music, and in discovering the hidden streams that can lead you to vast worlds, specifically coding via music production and beat-making.
Growing up, I always had a design-centric view of the world. I became an avid video gamer in 6th grade and would try to make my own levels using map editors, or my own mods (new games built on top of existing games). Now these mechanics are built into the games themselves: Minecraft, for instance, is a huge open world where players can build whatever they want.
Around that same time in 6th grade, my father gave me a Roland keyboard and a copy of Steinberg Cubase, but I was uninspired by its interface and workflow. Being a visual learner, I loved to challenge myself with software that facilitated different workflows and layouts, and thus different ways of thinking. I made plenty of beats with that setup, but it was eventually abandoned so I could level up my character in Diablo 2 and beat my friends at Starcraft.
It wasn’t until three years later, when I discovered Propellerhead Reason, that I really caught the beat-making bug. Instead of staying up late at night playing games like Starcraft and Diablo 2, I would stay up making beats, because Reason was a video game to me, or at least it looked like one: a video game where I designed the rules and played against myself, an audio/video game that helped me create. Reason helped me visualize composition and music production differently, and helped me generate ideas when I had none. It felt even more unstructured than the open virtual worlds of video games, and that was the freedom I had sought through mods and map editors.
When I went to college at the University of California, Santa Barbara, I was obsessed with making beats. Two years later I left the sun-drenched beaches of Santa Barbara for the arctic winters of Boston and Berklee College of Music. Looking back, I would credit Reason as a huge factor in that decision.
At Berklee I took an Ableton Live class and was hooked thereafter. One of my professors at Berklee would tell us:
“These courses are only introductions to these topics — to master these skills you will need to go off on your own, to experiment and create.”
More than anything, Berklee helped guide me through the vast field of music technology to find my passion at the intersection of coding and music.
During my second year at Berklee, in the Electronic Production and Design department, I took classes in Csound and Max/MSP, which introduced me to coding concepts such as abstraction, classes, objects, and basic types, using music as the vehicle to deliver those ideas.
Once Ableton Live introduced Max4Live, I was reliving my dreams of map editors and mods. Max4Live lets users create their own instruments and effects and extend the Ableton Live software in whatever way they can imagine. Max/MSP (and thus Max4Live) is a graphical programming environment, so it fit my visual learning style. It also let me visualize the control flow of my favorite parts of music production, such as recreating a synthesizer from its smaller modules.
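To give a flavor of that modular idea in code: below is a minimal sketch of a synthesizer built from two small modules, an oscillator and an envelope, patched together the way you would connect objects in Max/MSP. The function names and parameters here are illustrative, not any real Max or Ableton API.

```python
import math

# Each function is one "module"; patching them together is composition.

def oscillator(freq, sample_rate, num_samples):
    """Sine oscillator module, analogous to Max's [cycle~] object."""
    return [math.sin(2 * math.pi * freq * n / sample_rate)
            for n in range(num_samples)]

def envelope(attack_samples, num_samples):
    """Linear attack ramp, analogous to a simple [line~] ramp."""
    return [min(n / attack_samples, 1.0) for n in range(num_samples)]

# Patch the modules together: oscillator -> envelope -> output buffer.
tone = oscillator(freq=440.0, sample_rate=44_100, num_samples=2_000)
ramp = envelope(attack_samples=1_000, num_samples=2_000)
output = [s * e for s, e in zip(tone, ramp)]
```

The point is the signal flow: each module does one small job, and the instrument emerges from how they are wired together.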
For my Electronic Production and Design degree thesis project, I created a video game using Ableton Live, Adobe Flash (ActionScript 3 programming language), and Open Sound Control.
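Open Sound Control is the glue in a project like that: it carries control messages (notes, fader moves, game events) between programs over the network. As a hedged illustration of the protocol's wire format, not a full OSC library, here is a minimal encoder for a message with one float argument:

```python
import struct

# An OSC message is: a null-terminated address padded to 4 bytes,
# a type-tag string (",f" means one float argument), then the
# arguments as big-endian 32-bit values.

def osc_pad(data: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a multiple of 4."""
    data += b"\x00"
    while len(data) % 4 != 0:
        data += b"\x00"
    return data

def osc_message(address: str, value: float) -> bytes:
    packet = osc_pad(address.encode("ascii"))  # e.g. b"/fader"
    packet += osc_pad(b",f")                   # type tags: one float
    packet += struct.pack(">f", value)         # big-endian float32
    return packet

packet = osc_message("/fader", 0.5)
```

A packet like this could be sent over UDP from a game to Ableton Live (via Max4Live) or any other OSC-aware application.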
That was it for me; I was hooked on coding from there on. Coding and music are such wide and diverse areas: Music and Neurology, Music Information Retrieval, Machine Learning, new ways of performing and DJing, new instruments, and new sounds. (Side note: NASA sent a golden record, a “Best of Earth” mixtape, into space!)
After college I took a few classes through YouTube, Lynda, and City College of San Francisco to learn the basics of iOS programming, and read blogs and articles I found on Google.
For my first iOS app, I used an audio library called libPd, an embeddable version of Pure Data. Pure Data is an open-source counterpart to Max/MSP; both were created by the same person, Miller Puckette. Using Pure Data, I was able to bridge my Max/MSP experience and my knowledge of audio with this new world of iOS programming. I also used Cocos2d, a game engine built on the Objective-C programming language. Fortunately, Xcode (the IDE you use to make iOS apps) has a graphical interface called Interface Builder.
My latest app, Slicr, is a remix production tool used to slice samples. Slicing is a common sampling technique modern electronic musicians use to remix and reuse elements from an existing recording. Producers will often slice and sample “breaks”, the isolated drum and rhythm sections of recordings. For Slicr, I used only Xcode, Swift, and Ableton Live (to create the sound library).
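The core idea behind slicing can be sketched in a few lines. This is a hedged illustration of the technique, not Slicr's actual implementation: cut a sample buffer into equal slices, then trigger the slices in a new order to build a remix pattern.

```python
def slice_sample(samples, num_slices):
    """Cut a buffer into equal-length slices (leftover samples dropped)."""
    length = len(samples) // num_slices
    return [samples[i * length:(i + 1) * length] for i in range(num_slices)]

def resequence(slices, order):
    """Rearranging slices is the remix: play them back in any order."""
    return [sample for i in order for sample in slices[i]]

# Example: an 8-sample "break" cut into 4 slices, then reshuffled.
break_loop = [1, 2, 3, 4, 5, 6, 7, 8]
slices = slice_sample(break_loop, 4)            # [[1, 2], [3, 4], [5, 6], [7, 8]]
remix = resequence(slices, order=[3, 1, 0, 2])  # [7, 8, 3, 4, 1, 2, 5, 6]
```

Real slicers cut at transient or beat boundaries rather than into strictly equal chunks, but the reorder-and-retrigger idea is the same.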
Check out my app Slicr here: https://www.beatshoplabs.com/slicr
Voila! That was my path from addicted video gamer to beat-maker to coder, using music and technology as a vehicle. But this is only one possible path. There are many pathways still being unraveled, explored and facilitated through new ideas that don’t rely on archaic classroom models of learning.
If you want to bridge your interests with your existing skills, there are multiple ways to proceed. One takeaway from Hip-Hop culture is that apart from the four main elements (Breaking, DJing, MCing, Graffiti) there is a fifth element: Knowledge, and more specifically knowledge of self. Know what kind of learner you are. If you learn visually, seek out visual tools. If you learn aurally, seek out podcasts, or listen to YouTube tutorials without the video open. If you are a tactile learner, save up for some hardware or take drumming classes. If you prefer in-person learning, find friends who want to learn too, go to meetups, and take workshops.
Even in “open virtual worlds” there are limitations — the rules that you have to play by. Once you understand the basic language of creation you can freely extend the world and its rules. Oh sure, there are still rules when you dig deep down, but that is the quest of living and learning — to dive into these vast oceans of knowledge and hit the crust at the bottom, then wonder what is beneath. What lies beneath are connected pathways that can be applied to any subject area.