Composing Music With Your Eyes

Mary Shahinyan and Jandy Le on how neurotechnology empowers anyone to create a melody.

Neurotech@Berkeley
Oct 21, 2019
Image design by Neurotech@Berkeley.

Over time, humans have created compelling forms of expression in every domain, and music is among the most powerful. From recognizing the melody in a bird’s chirp to creating a beat by simply snapping our fingers, we hear music everywhere. Music is our universal language, a means of communicating across cultures, time, and unshakable differences. It’s something fascinating, sublime, perhaps even magical.

But beyond what music does for our minds, what does it physically do for our brains? Why does music reach deep into the soul and provoke such depth of feeling? Most importantly, what are our brains like when we create music?


The specific origins of music remain unknown. However, human ancestors millions of years ago appear to have had innate capacities for rhythm and vocal sound patterns. Since then, humans have developed the ability to generate intricate harmonies from an extensive range of instruments and digital editing systems. The means of playing and distributing music have grown to such an extent that it has become an indispensable part of our daily lives. Driving to work? Music. Showering? Music. Studying? Music. The intangible concept of music, which has evolved alongside humankind, has touched so many lives and communities that it has become increasingly difficult to ignore the connection between music and its influence on the mind.

Neurotechnology has equipped us with the tools needed to break down the magic. Brain-computer interfaces, for instance, can detect the signals sent to the brain via the auditory nerve. These signals can then be interpreted and routed toward an external device, or away from it. If someone listens to classical music, the detected signals could in principle be sent back to that person’s brain, letting them “listen” to classical music without any actual music playing, or even without the ability to detect sound.

Unsurprisingly, we refer to such devices as brain-computer music interfaces (BCMIs). The BCMI grew out of University of Plymouth professor Eduardo Miranda’s search for a way for paralyzed people to make music with their eyes. He was motivated to give physically disabled people the ability to interact with musicians and create music alongside them.

The result of this research was Activating Memory, a piece of music that underlines the importance of cooperation and inclusivity. Its octet instrumentation is unique, to say the least: Miranda scored Activating Memory for a BCMI quartet and an ordinary string quartet. The four members of the BCMI quartet were four paralyzed people creating music with their eyes. Each of them contributed to generating the score that was then displayed for the other four performers to sight-read (Miranda). This collaborative process of creating music provided an environment in which neurotechnology bridged the gap between eight people despite their differences.

From Vimeo, by Eduardo R. Miranda. Neurotech@Berkeley publishes articles that encourage public discussion on noteworthy issues. We link this content here having judged this a good application of Fair Use Principles.

When analyzing electroencephalogram (EEG) readings, researchers can identify mental processes using power spectral analysis, a method of decomposing the signal into frequency bands that correlate with a person’s mental state.


EEG signals are generally difficult to detect because they must pass through the layers between the cortex, skull, and scalp, and it takes thousands of neurons firing in synchrony for a signal to be strong enough to register at all. The boundaries between frequency bands are also not sharply defined and may overlap slightly, so proper amplification systems are necessary for analysis.
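To make this concrete, here is a minimal sketch of what power spectral analysis of a single EEG channel can look like in code. It is not Miranda’s pipeline: the band boundaries, the sampling rate, and the use of SciPy’s Welch estimator are our own assumptions, chosen purely for illustration.

```python
import numpy as np
from scipy.signal import welch

# Conventional EEG bands (Hz); boundaries vary across the literature and overlap in practice.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(eeg, fs=256.0):
    """Estimate the average power in each frequency band for one EEG channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2))  # 2-second windows
    df = freqs[1] - freqs[0]
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = psd[mask].sum() * df  # approximate integral of the PSD over the band
    return powers

# Quick check on synthetic data: a 10 Hz "alpha" rhythm buried in noise
# should dominate the alpha band.
fs = 256.0
t = np.arange(0, 10, 1 / fs)
eeg = 5 * np.sin(2 * np.pi * 10 * t) + np.random.randn(t.size)
print(band_powers(eeg, fs))
```

In a real system, this kind of band-power readout would be computed continuously on short windows of data, with the relative strength of each band serving as a rough proxy for the wearer’s mental state.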

Miranda’s BCMI system, built for Activating Memory, is one such amplification-and-analysis system. EEG electrodes are placed over the prefrontal cortex, a region of the brain associated with emotion (and with music). From these EEG readings, a separate music-generation system produces a short melody. This generator takes two parameters: one that dictates the musical process, and one that specifies the tempo and dynamics of the piece.
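The article describes Miranda’s generator only at this high level, so the following is a hypothetical sketch rather than his actual method: the riff names, MIDI notes, and the way a single intensity value maps onto tempo and dynamics are all our own assumptions, meant only to show what a two-parameter melody generator could look like.

```python
import random

# Invented note material (MIDI numbers); Miranda's actual riffs and rules are not reproduced here.
RIFFS = {
    "ascending":  [60, 62, 64, 65, 67],
    "descending": [67, 65, 64, 62, 60],
    "arpeggio":   [60, 64, 67, 72, 67],
}

def generate_bar(process, intensity, seed=0):
    """Toy two-parameter generator: `process` picks the melodic material,
    `intensity` (0.0 to 1.0) scales the bar's tempo and dynamics."""
    rng = random.Random(seed)
    notes = RIFFS[process][:]
    rng.shuffle(notes)                      # light randomization, as the article describes
    tempo_bpm = 60 + int(intensity * 80)    # 60 to 140 beats per minute
    velocity = 40 + int(intensity * 80)     # MIDI velocity (loudness), 40 to 120
    return {"notes": notes, "tempo_bpm": tempo_bpm, "velocity": velocity}

print(generate_bar("arpeggio", intensity=0.7, seed=1))
```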

The very first BCMI system was tried by a patient with locked-in syndrome, a neurological disorder in which all voluntary muscles except the eyes are paralyzed. The patient was shown a computer screen with four images, each depicting a different riff or instrument and each flashing at a specific frequency. When he focused his attention on one target, his emotional responses to the visual stimuli were categorized into regions of the circumplex model of affect, developed by James Russell in 1980.
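Selecting a target by its flicker rate is characteristic of steady-state visually evoked potential (SSVEP) interfaces, where attending to an image flashing at a given frequency boosts the EEG power near that frequency. Whether Miranda’s exact pipeline looked like this is an assumption on our part; the stimulus frequencies and the Welch-based comparison below are ours, offered only as a simplified sketch.

```python
import numpy as np
from scipy.signal import welch

# Hypothetical flicker frequencies (Hz) for the four on-screen images.
TARGET_FREQS = {"image_1": 7.0, "image_2": 9.0, "image_3": 11.0, "image_4": 13.0}

def detect_attended_target(eeg, fs=256.0, tol=0.5):
    """Guess which flickering image the user is attending to by comparing
    EEG power in a narrow window around each stimulus frequency."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 4))  # 4-second windows, 0.25 Hz resolution
    scores = {}
    for name, f0 in TARGET_FREQS.items():
        mask = (freqs >= f0 - tol) & (freqs <= f0 + tol)
        scores[name] = psd[mask].mean()
    return max(scores, key=scores.get)

# Synthetic check: an 11 Hz response buried in noise should select image_3.
fs = 256.0
t = np.arange(0, 8, 1 / fs)
eeg = 2 * np.sin(2 * np.pi * 11 * t) + np.random.randn(t.size)
print(detect_attended_target(eeg, fs))
```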

From Miranda, Eaton, et al., “Affective Jukebox: A Confirmatory Study of EEG Emotional Correlates in Response to Musical Stimuli.” Fair use.

Based on the twelve categories, algorithms alter and randomize riffs to convey the corresponding mood trajectory. Signals are analyzed before each musical bar is produced, and the music is adjusted accordingly to reflect the correct tempo and melodies. This is precisely how the BCMI quartet created the sequence of notes for the string quartet to sight-read: each member of the BCMI quartet created a section of the music by focusing on a chosen region of the screen, and the signals evoked by the different flashing images were captured, analyzed, and recorded via the electrode caps worn by the paralyzed participants. In turn, the music elevates the user’s mood and creates a new masterpiece at the same time.
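As a toy illustration of how such categories might steer the music, and emphatically not the published method, the sketch below bins an estimated valence/arousal reading into twelve equal sectors of the circumplex and derives a per-bar tempo and mode from it; the sector layout and parameter mappings are our own invention.

```python
import math

def circumplex_category(valence, arousal):
    """Map a (valence, arousal) estimate in [-1, 1] x [-1, 1] to one of 12
    equal angular sectors of the circumplex, counted counterclockwise."""
    angle = math.atan2(arousal, valence) % (2 * math.pi)
    return int(angle // (2 * math.pi / 12))

def bar_parameters(valence, arousal):
    """Toy mapping from affective state to the next bar's tempo and mode."""
    tempo_bpm = 80 + int(40 * arousal)           # higher arousal -> faster
    mode = "major" if valence >= 0 else "minor"  # pleasant -> major, unpleasant -> minor
    return {"category": circumplex_category(valence, arousal),
            "tempo_bpm": tempo_bpm,
            "mode": mode}

print(bar_parameters(valence=0.6, arousal=0.3))
```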

Thus, this mechanism doesn’t work like a typical brain-machine interface: the person is not explicitly controlling the music the way one might control a prosthetic by merely imagining a movement; instead, a separate system generates the music based on an analysis of the user’s mental state.

People with disabilities are often excluded from the realm of the creative arts and cannot reap the benefits that come from engaging with art forms such as music, theatre, or dance. The absence of such activities can mean lost opportunities for neural stimulation and can contribute to a decline in an individual’s mental state, including depression. Listening to and creating music stimulates neural networks in the brain, specifically activating regions responsible for motor actions, emotions, and creativity.

An fMRI scan from the Portland Chamber Orchestra. Displayed under fair use.

The image above depicts the different areas of the brain that are activated when listening to music. This increase in activity reflects neural stimulation: when neurons become more active, they need more energy, so nearby blood vessels dilate to deliver it, and it is this change in blood flow that fMRI picks up. Listening to and creating music can, therefore, activate regions of the brain responsible for emotion and memory.

With research in the neurotechnology industry continuing to advance, it will not be long before music becomes a truly universal language.


Written by Neurotech@Berkeley

We write on psychology, ethics, neuroscience, and the newest in neural engineering. @UC Berkeley

Writers, consultants, engineers, and designers working toward advancing neurotechnology for the benefit of humanity.
