The Cerebral World of Brainwave Storytelling

Image credit: Ana Kolasa

We all have that one friend who talks to the movies. The one who begs the curious teenager on screen not to open that creaky door to the basement, who cheers on the hero as he reaches out a hand to save his plucky companion from falling into an endless abyss, or who mutters, “Kiss her, already!” to the bashful couple with unresolved romantic tension.

There’s good news for that friend — EEG brain sensor technology could allow for movies that finally talk back.

EEG sensors are lightweight headsets that read electrical signals in the brain in real time, showing which areas are lighting up and when, and translating that activity into data.
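As a rough illustration of what "translating into data" means in practice (a sketch of the general technique, not any specific headset's software), raw EEG is typically sampled many times per second and reduced to power values in frequency bands like alpha (relaxation) and beta (alertness):

```python
import numpy as np

def band_power(samples, fs, low, high):
    """Estimate signal power in the [low, high] Hz band via the FFT."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].sum()

# Simulated one-second EEG window: a 10 Hz (alpha-range) oscillation plus noise
fs = 256  # samples per second
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(fs)

alpha = band_power(signal, fs, 8, 12)
beta = band_power(signal, fs, 13, 30)
print(alpha > beta)  # True: the simulated alpha rhythm dominates
```

Real systems layer artifact rejection and calibration on top of this, but the core idea is the same: brain activity becomes a handful of numbers per second that software can react to.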

In developing the story for our EEG-powered brain training app, we took a dive into the growing world of artists, scientists and storytellers who are experimenting with narrative using brainwave sensors.

Richard Ramchurn’s excursions into creating interactive films using viewers’ brainwaves are probably the closest to the idea of a movie changing when you yell at the screen, although of course it’s not that simple (or disruptive).

The University of Nottingham PhD student’s most recent film, The Moment, debuted in June of 2018. It’s a 27-minute sci-fi dystopian story that shifts scenes, shots, and even music based on one audience member’s brainwaves, specifically a frequency band associated with attentiveness.

Ramchurn describes an exchange, a two-way flow between the controlling viewer and the film. The viewer’s emotions affect what’s shown on screen, and the film (hopefully) affects the viewer’s emotions.

This is as direct a form of interaction with a traditional form of storytelling as we’ve seen using EEG, but it’s still in its very early stages. The film effectively flips back and forth between two different storylines depending on the viewer’s attention, a simple way of putting brainwaves to work to affect narrative, but perhaps one that’s not as emotionally driven as Ramchurn intends.
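At its simplest, this kind of two-track editing amounts to a threshold check on an attention score at each cut point. The sketch below is our own illustration of that general idea, with hypothetical shot names, not Ramchurn's actual system:

```python
def choose_shot(attention, shot_a, shot_b, threshold=0.5):
    """Pick the next shot from one of two parallel storylines,
    based on a normalized attention score in [0, 1]."""
    return shot_a if attention >= threshold else shot_b

# Hypothetical cut points: each entry pairs one shot from each storyline
cuts = [("a1", "b1"), ("a2", "b2"), ("a3", "b3")]
attention_readings = [0.8, 0.3, 0.6]  # one EEG-derived reading per cut

edit = [choose_shot(att, a, b) for att, (a, b) in zip(attention_readings, cuts)]
print(edit)  # ['a1', 'b2', 'a3'] — the film weaves between the two tracks
```

The simplicity is the point: the viewer never makes a conscious choice, yet every screening produces a different cut.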

The exact connection between what the controlling viewer is feeling and how the story changes is hazy, a common problem with EEG storytelling that purports to work off emotion.

Researchers in Tel Aviv and Middlesbrough, UK are testing their own EEG technology, which lets audience members directly affect the narrative through empathy and positive feelings towards the protagonist.

In a 2013 preliminary study, subjects showed a 52.7% rate of success in using their technology to change a branching narrative from a negative to a positive conclusion by attempting to provide “mental support” to the protagonist in distress.

This research points towards a brain sensor design that facilitates more organic, emotional interaction with a narrative than Ramchurn’s attention-based model. It directly encourages emotional investment in the story by harnessing the viewer’s empathy and desire for the protagonist to succeed to drive the narrative forward.

However, the researchers note the difficulty of controlling emotion and the frustration of subjects who couldn’t focus their brainwaves to affect the narrative.

Ramchurn’s method was designed to provide minimal disruption to the flow of the narrative and allow the story to unfold more or less like a traditional film. In the Tel Aviv/Middlesbrough experiments, the narrative comes to a grinding halt when viewers fail to stop the demise of the protagonist with their thoughts. Knowing firsthand how difficult it can be to learn to produce a desired brainwave pattern through an EEG, we see this as a notable barrier to engaging with the story.

Although it’s the first instinct for cinema fans like me, trying to marry EEG interaction and traditional narrative forms such as film may only prove to limit the potential of the technology. Many avant-garde artists have put EEG to use in creating installations and performance art pieces.

Andrew John Milne’s Media for Solo Performer used his own brainwaves to navigate Google Maps locations from his and his father’s lives as their pre-recorded narration played, forming a “non-linear documentary”.

Luciana Haill and Lia Chavez both created projects in which viewers’ EEG input controls sound and light displays, while artists such as Marina Abramovic explored the synchronization of two people’s brainwaves in her Mutual Wave Machine.

None of these projects uses a traditional narrative structure or directly shapes a story’s “plot” with brainwaves, but they point towards alternative methods of EEG storytelling.

Brainwave-powered light and sound can contribute to the tone and atmosphere of a story, which often plays a large part in non-linear or abstract storytelling. The journey of two people synchronizing their brainwaves could be an experiential narrative in itself.

EEG storytelling is still an experimental form of narrative. These existing projects provide just as many questions as answers when we look at the possibilities of the medium. What sort of brainwaves should drive the story? To what extent should the user affect the narrative and how easily? Are traditional narrative forms the best use for this technology?

The more we can learn from the creative community of EEG innovators, the better we can answer these questions for ourselves. What’s clear is that if you want your thoughts to control a narrative, you’re better off putting on an EEG sensor than shouting at the TV screen.
