Music and AI
Will Artificial Intelligence augment or disrupt the role of the composer?
PART 1: Resuscitating J.S. Bach
The music industry gets disrupted often. Industry 3.0 was dominated by automation, computers and electronics, with composers like Stockhausen and Xenakis, among many others, experimenting with electronic sound within the context of music. But who will shape the creation of music in Industry 4.0? Will it be the large digital platforms, the Googles and Apples, that provide the tools for anyone to create music using artificial intelligence, broadening and augmenting the role of the composer? Or will society reject the role of Artificial Intelligence (AI) in creating music and hold to the cultural and national place of composers that has reigned for hundreds of years?
As a musician trained in classical music performance, analysis and composition, I have a strong, internally conflicted emotional response to calling creators of music who use artificial intelligence “composers”. Apparently, I am not alone. David Cope, the American scientist, composer and former Professor of Music, knows this tension all too well as the developer of the Experiments in Musical Intelligence (EMI) software and the Emily Howell project. Cope has experienced two extremes. On the one hand, there has been considerable interest in AI-generated music composition over the past two decades, with praise from the scientific community. The music industry, however, had a different viewpoint. To quote Cope:
“A number of big-name classical performers expressed interest, but their agents wouldn’t let them touch it with a ten-foot pole, citing industry controversy over the work. They thought it would blemish the name of the performer”.
There are many digital tools that could potentially disrupt the role of the composer. Over the past year, I have tried different AI music programs and have remained fairly skeptical that AI will replace the art of composition anytime soon. I had fun with Bachbot, which played paired samples of Bach, one original, the other generated by AI, and with Impro AI‑Musico, which generates endless streams of repetitive music and beats with a game-like user experience. There are many more applications and machine learning projects that explore how to create a musical experience using AI, and until recently I felt quite safe in saying that machines would not be replacing the creativity of humans any time soon.
That was until OpenAI released its prototype MuseNet in May 2019.
In this blog, I will explore the capabilities of OpenAI’s MuseNet with a focus on the style of J.S. Bach. Using a series of music demonstration experiments, I’ll reflect on the extent to which AI technology is likely to drive composers and creators of music to adopt AI applications in their music making.
J.S. Bach’s music is highly complex in terms of its modulation formulas, its use of both tonal and real fugue answers, and its complex rhythms across polyphonic voices. However, thanks to extensive MIDI transcriptions of Bach’s compositions, there is sufficient data to train machines to compose like Bach. The aim of this experiment is to test the capabilities of the musical machine and to gauge the extent to which human creativity plays a role in generating Bach-like music. All experiments use the advanced settings on MuseNet with the style set to Bach and no seed notes for the introduction. The piano is used as the testing instrument throughout all experiments.
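To give a feel for what training on MIDI transcriptions involves, here is a minimal sketch of how polyphonic note events might be flattened into a token sequence for a sequence model. The event format and token names (`NOTE_`, `DUR_`, `WAIT_`) are illustrative assumptions for this blog, not MuseNet’s actual vocabulary:

```python
# Sketch: flattening polyphonic note events into a token sequence,
# the kind of representation a sequence model can be trained on.
# Token names and event format are illustrative, not MuseNet's actual scheme.

def tokenize(events):
    """events: list of (start_time, pitch, duration) tuples, times in beats."""
    tokens = []
    clock = 0.0
    for start, pitch, duration in sorted(events):
        if start > clock:                      # advance time between onsets
            tokens.append(f"WAIT_{start - clock:g}")
            clock = start
        tokens.append(f"NOTE_{pitch}")         # MIDI pitch number (60 = middle C)
        tokens.append(f"DUR_{duration:g}")
    return tokens

# Hypothetical two-voice fragment: upper and lower voices enter together.
fragment = [
    (0.0, 60, 1.0),   # middle C, one beat
    (0.0, 48, 2.0),   # C an octave below, two beats
    (1.0, 62, 1.0),   # D
    (2.0, 64, 1.0),   # E
]

print(tokenize(fragment))
# → ['NOTE_48', 'DUR_2', 'NOTE_60', 'DUR_1', 'WAIT_1', 'NOTE_62', 'DUR_1',
#    'WAIT_1', 'NOTE_64', 'DUR_1']
```

Flattening voices into one interleaved stream like this is what lets a single sequence model learn polyphony, at the cost of making long-range structure (openings, transitions, endings) harder to capture.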
Our first demonstration clip uses eight AI-generated segments to construct Bach. You can hear the excerpt below:
Analysis: While there are some Bach-like moments, the machine has trouble knowing how to start and therefore it sounds like the music starts in the middle of a piece. Modulations and rhythm were somewhat erratic, and it seems that the machine has not been able to sufficiently predict patterns in Bach’s music and make a Bach-like phrase.
Verdict: Limitations are evident in pattern prediction and structure, but the output is stylistically recognizable as Bach.
Let us try another simulation, this time taking the time to select an opening that starts well. This means resetting the MuseNet API a few times until we find an opening musical segment that sounds like the opening of a J.S. Bach composition. Our next excerpt is here:
Analysis: With careful choices among the excerpt options, a Bach-like composition is possible. The clip has a clear start, and phrases are properly constructed. The opening establishes a stable rhythm, and the modulations and harmonic structure are quite similar to Bach. As the clip develops, the pitch range becomes extreme and uncommon for Bach, who wrote for the harpsichord; however, these extreme ranges work well on the piano. By the time the fifth segment is generated, the piece starts to lose its consistent structure and the Bach feel is lost: the MuseNet algorithm is limited when it comes to a complex musical transition and/or ending.
Verdict: When it comes to J.S. Bach, MuseNet as a composition tool requires human intervention for starting, ending, and creating appropriate transitions in the musical composition process. Once the start of the AI-generated musical segments is decided, complex stylistic features in harmony and modulation are possible with AI.
Each of the default clips above, and the subsequent clips generated with the MuseNet API, have clear stylistic musical features that make the music distinguishably trained on J.S. Bach data. Providing options for human judgement and choice so that phrases can be developed is a strength of MuseNet, and one that can also support the creative process.
In this case, listening and deciding on the musical segments each time an excerpt is generated requires solid knowledge of J.S. Bach’s musical features in order to choose the best segment to complete the phrase. The MuseNet experience is like attempting to put a Bach jigsaw puzzle together without knowing how the picture will look or how many pieces the puzzle has.
I do think this prototype of MuseNet can be a novel way to explore Bach-like music, and it can empower those with basic music knowledge to develop musical creativity. What will determine its significance is what the MuseNet user chooses to do with the creation.
You can find more information about the online lecture Exploring Artificial Intelligence in Classical Music on Youtube here.
You can follow DAIN Studios on LinkedIn here.