Robot Music Through the Ages

Synced · Published in SyncedReview · Jul 24, 2019

It soothes us when we’re stressed; it walks a bride down the aisle for a wedding ceremony; it pumps out the beats that bring a dance floor to life. Music is all around us, and the soundscapes created by its notes and tones can even change our behavior. Music has long been an essential human art form, central to our cultural activities and social development. But that ad jingle you’re humming, the spooky soundtrack in the horror film you’re watching, or the catchy pop tune on the radio may not be the product of humans at all. More and more of today’s music makers are actually robots.

With recent developments in artificial intelligence and automation in machines, robots are advancing into previously unexplored industries such as music and entertainment.

Music and Machine

The player piano is often referred to as the first robotic instrument, but it is useful to trace the link between music and machines further upstream. In the third century BC, Ctesibius of Alexandria invented the first keyboard instrument, the Hydraulis. A series of semi-submerged panpipes used water pressure to sound the organ evenly and reliably via a windchest and keyboard. This well-engineered breakthrough instrument led to the first-century AD mathematician Hero of Alexandria’s bird whistles: water-flow-powered pneumatic birds that sang pre-sequenced musical notes.

Around the eighth century AD, barrel organs rolled onto the musical scene. Perhaps inspired by Hero’s pre-sequenced birdsong, barrel organs played a series of notes triggered by metal pins and staples embedded in a rotating barrel. This innovation paved the way for automating instruments with punched cards.

Centuries later, the player piano finally stepped onto the stage and was a massive hit. The earliest record of player pianos dates to the eighteenth century, and in 1919 more player pianos were mass-produced than traditional pianos. But why did the evolution of musical automation settle on this particular instrument?

To answer that question we need to introduce Joseph-Marie Jacquard, whose 1801 programmable loom forever changed textile production. The loom’s operation was directed by punched paper cards, enabling it to produce complicated patterns beyond the capabilities of previous weaving equipment. (Punched cards would later shape data input and storage in early computing, for example in the UNIVAC and the ENIAC.)

Punched cards and rolls worked perfectly with pianos: each musical note mapped naturally to a perforation, connecting the automated mechanical system to musical data. Later, “reproducing pianos” added dynamics such as volume and attack to the punched paper rolls, further improving the realism of the reproduced music.
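In modern terms, a piano roll is simply a time-by-pitch grid in which a perforation means “sound this note now.” Below is a minimal Python sketch of that encoding; the note names, grid size, and helper functions are illustrative assumptions, not a historical roll format.

```python
# A minimal sketch of the piano-roll idea: the roll is a time-by-pitch
# grid, and a hole means "sound this note now". Note names, grid size,
# and helpers are illustrative, not a historical roll format.

TIME_STEPS = 16
PITCHES = ["C4", "D4", "E4", "F4", "G4", "A4", "B4", "C5"]

# The "roll": one row per pitch, one column per time step; False = no hole.
roll = {pitch: [False] * TIME_STEPS for pitch in PITCHES}

def punch(pitch: str, step: int) -> None:
    """Punch a perforation: schedule `pitch` to sound at time `step`."""
    roll[pitch][step] = True

# Punch a simple ascending scale, one note every two time steps.
for i, pitch in enumerate(PITCHES):
    punch(pitch, i * 2)

def play() -> None:
    """Scan the roll column by column, as a player piano's tracker bar would."""
    for step in range(TIME_STEPS):
        sounding = [p for p in PITCHES if roll[p][step]]
        print(f"step {step:2d}: {' '.join(sounding) or '(rest)'}")

play()
```

Reproducing rolls essentially widened this grid, adding lanes of perforations that carried dynamics information alongside the notes.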

From this foundation in pianos, robotic music expanded to include percussion robots, string robots, and wind robots.

Robotic Musical Instruments vs. Musical Robots

In academia, a “robotic musical instrument” was typically defined as a device that could produce musical sounds automatically by leveraging mechanical systems. Gil Weinberg’s 2005 innovation brought the definition much closer to current thinking on the topic.

A professor of music technology at Georgia Tech and founding director of the Georgia Tech Center for Music Technology, Weinberg designed the humanoid drummer Haile along with Scott Driscoll. Leveraging multiple musical algorithms and a pair of solenoid-driven arms, Haile is a revolutionary improvising robot that can listen to a musical performance and then join in and play along; Weinberg termed it an “automated mechanical sound generator.”
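The listen-then-join-in loop can be pictured with a toy example. The Python sketch below is not Haile’s actual algorithm, only a minimal illustration of the idea: estimate the underlying pulse from a few heard drum onsets, then schedule strikes that continue it. The onset times are hypothetical.

```python
# A toy illustration of listen-then-join-in (not Haile's actual algorithm):
# estimate the pulse from heard drum onsets, then schedule strikes that
# continue it. All onset times (in seconds) are hypothetical.

heard_onsets = [0.00, 0.52, 1.01, 1.49, 2.02]

# Estimate the beat period as the mean inter-onset interval.
intervals = [b - a for a, b in zip(heard_onsets, heard_onsets[1:])]
beat = sum(intervals) / len(intervals)

# Join in: schedule four strikes that continue the detected pulse.
last = heard_onsets[-1]
response = [round(last + beat * (i + 1), 2) for i in range(4)]

print(f"estimated beat period: {beat:.2f}s")
print(f"strike at: {response}")
```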

The idea of machines creating music rather than recreating musical sounds led to further questions: How can human musicians interact with robotic musicians? What if human players are entirely removed from the scenario?

The Toyota Robotics Research Division showcased five humanoid musical robots at the 2005 World Expo in Aichi, Japan. Each of the Toyota Partner Robot musicians had unique abilities; Version 1, for example, used its humanoid lips and fingers to play the trumpet. These robot band members were however primarily designed as home companions, their advanced motor coordination skills aimed more at performing household chores than playing jazz.

Haipeng Mi is an associate professor at Tsinghua University’s Academy of Arts & Design and project leader and chief scientist of MOJA, the first traditional Chinese music robot band. MOJA has three robot musicians named after celestial bodies — Yuheng, Yaoguang, and Kaiyang — who play the bamboo flute; the konghou, a traditional plucked stringed instrument; and drums.

Mi explained the band’s creative evolution in a conversation with Synced: “We initially designed the robots to mimic how human musicians would play an instrument, but encountered problems. Even when the robots performed well, the music they produced didn’t sound very human. But since robots can play any instrument, why do they have to be limited to instruments played by humans? Why can’t there be a scenario where robots become good at playing innovative instruments that humans are not good at? Maybe in the future, robots will play their own unique instruments.”

MOJA music composer Yuyang Hu notes that although a human bamboo flute player can use professional techniques, for example to extend their breath, the air pressure and airflow produced by human lungs and mouths will still inevitably fluctuate. Consistency is simple for a robot, but human playing techniques, such as shaping airflow with the tongue or subtly coloring tones by changing mouth and lip positions, remain beyond the ability of today’s robots.

Mi told Synced he also participated in the research and development of a novel robot rock band during his electrical engineering studies at the University of Tokyo in 2012: “At the time, I was already very interested in applying robotics to a completely different field.” Mi says he shifted his focus from rock to traditional Chinese music because he prefers the latter’s charm and artistry.

The robot rock band Mi refers to is Z-Machines, which headlined at Maker Faire 2013 in Tokyo. The hardcore band was led by Mach, a two-meter-tall, 78-fingered, dreadlocked guitarist. The drummer meanwhile had 22 arms, and the keyboard player, lasers. Yuri Suzuki Design Studio created the Z-Machines “to perform beyond the capabilities of the most advanced human musicians.” Z-Machines even collaborated on an EP called “Music for Robots” with British electronic music innovator Tom Jenkinson, commonly known as “Squarepusher.”

The development of robotic instruments and music owes much to the iconic German band Kraftwerk, which pioneered the modern “robot pop” genre. Formed in the 1970s, Kraftwerk combines electronic music with pop elements such as repetitive rhythms and melodies. The band continues its homage to automata in live performances with identical suits and minimal movements, and had mechanical replicas of band members perform its hit song “The Robots” onstage during the 1981 Computer World tour.

And then there’s the touring grindcore band Captured! By Robots, which features the robot GTRBOT666 on guitar and bass and DRMBOT0110 on drums. The band’s single human member, JBot, explains: “I couldn’t play with humans anymore, humans have too many problems, like drugs, egos, girlfriends, jobs… I figured I could make a band that I could play with until I die, and not worry about if anyone in the band was going to quit and kill the band.”

For human-free musical performance, consider the “socially interactive and improvisational robotic marimba player” Shimon, which Gil Weinberg introduced in his 2009 paper “Interactive jamming with Shimon: A social robotic musician,” presented at the ACM/IEEE International Conference on Human-Robot Interaction (HRI). In 2017 Shimon was trained on some 5,000 popular songs and two million musical motifs, riffs and licks to learn to compose its own music.
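For a rough intuition of what learning from motifs and licks can mean, the sketch below builds a first-order Markov chain over a tiny hypothetical corpus and samples a new motif from it. Shimon’s actual training used deep learning on far larger data; this only illustrates the learn-then-generate idea.

```python
import random
from collections import defaultdict

# An illustrative sketch of learning from a corpus of licks: a first-order
# Markov chain over note names. Shimon's real training used deep learning
# on far larger data; this only shows the learn-then-generate idea.

corpus = [  # hypothetical licks
    ["C", "E", "G", "E", "C"],
    ["C", "D", "E", "G", "A", "G"],
    ["E", "G", "A", "C", "A", "G", "E"],
]

# Record every note-to-note transition observed in the corpus.
transitions = defaultdict(list)
for lick in corpus:
    for a, b in zip(lick, lick[1:]):
        transitions[a].append(b)

def compose(start: str, length: int) -> list:
    """Generate a new motif by sampling the learned transitions."""
    motif = [start]
    while len(motif) < length:
        options = transitions.get(motif[-1])
        if not options:  # dead end: no transition ever observed
            break
        motif.append(random.choice(options))
    return motif

print(" ".join(compose("C", 8)))
```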

Will robots someday outperform humans at writing and performing hit songs? Rapid recent advances made possible by machine learning suggest they might; Shimon may soon sing its own version of Drake’s “Started from the bottom now we’re here.” Says the bot’s co-creator Mason Bretan: “Shimon already has four arms and can hold eight mallets, so it can already do things a person can’t.”

Machine learning is also changing how soundtracks are made. Luxembourg startup Aiva Technologies has emerged as a leader in this field with its Artificial Intelligence Virtual Artist app. Trained on a massive corpus of classical music and leveraging reinforcement learning techniques, Aiva produces sheet music tailored for film, video games, commercials, and other media.
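Aiva’s internals are not public, so the following is only a generic sketch of reward-driven note selection in the spirit of reinforcement learning: a tabular agent learns which melodic intervals a hand-written “consonance” reward prefers, then greedily generates a short phrase. Every constant and reward choice here is an assumption made for illustration.

```python
import random

# A generic sketch of reward-driven composition, NOT Aiva's actual system
# (its internals are not public). A tabular agent learns which melodic
# intervals a hand-written "consonance" reward prefers. All constants
# below are assumptions made for illustration.

NOTES = list(range(12))            # pitch classes C=0 .. B=11
CONSONANT = {0, 3, 4, 5, 7, 8, 9}  # intervals (in semitones) we reward

def reward(prev: int, nxt: int) -> float:
    return 1.0 if (nxt - prev) % 12 in CONSONANT else -1.0

# Q[state][action]: learned value of playing note `action` after `state`.
Q = [[0.0] * 12 for _ in range(12)]
alpha, epsilon = 0.1, 0.2  # learning rate, exploration rate

for _ in range(5000):  # train on single note-to-note transitions
    prev = random.choice(NOTES)
    if random.random() < epsilon:
        nxt = random.choice(NOTES)                   # explore
    else:
        nxt = max(NOTES, key=lambda a: Q[prev][a])   # exploit
    Q[prev][nxt] += alpha * (reward(prev, nxt) - Q[prev][nxt])

# Greedily generate an eight-note phrase from the learned table.
phrase = [0]  # start on C
for _ in range(7):
    phrase.append(max(NOTES, key=lambda a: Q[phrase[-1]][a]))
print(phrase)
```

In a production system the reward would come from learned models of musical style rather than a hand-written rule, but the feedback loop is the same.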

Human composers and musicians however appear to be safe for the time being. For most researchers and musicians who are developing music-generating machines the goal is not to replace humans, but rather to push the boundaries of mechanical abilities and machine intelligence to explore new possibilities in this important, age-old creative arena.

Journalist: Fangyu Cai | Editor: Michael Sarazen

