Films That Watch You While You Watch Them: A Brief History of Interactive Movies
Most films are fixed: the audience cannot change them during viewing, and once distributed they do not change at all. Interactive cinema allows the audience to influence or change elements of a film, so each viewer’s experience can differ from others’. Before digital technology was widely available, this was done manually. Nielsen and other data-tracking and ratings companies have analyzed audience reactions to film and television for decades, but they have not used that data to interact with the media in real time.
Radúz Činčera’s 1967 film ‘Kinoautomat’ and other early interactive films involved the audience consciously and explicitly selecting which narrative route to take. In ‘Kinoautomat’, a moderator appeared in front of the screen nine times during the showing, asked the audience to vote between two choices for the next scene, and the winning scene was then shown.
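The voting logic is simple enough to sketch in a few lines. The function name and vote format below are illustrative assumptions only; the 1967 system was electromechanical, and this is not a reconstruction of it.

```python
def next_scene(votes, option_a, option_b):
    """Return whichever of two candidate scenes got more audience votes.

    `votes` is a list of "A"/"B" strings, one per seat button press.
    A tie falls through to option A, treated here as the default
    continuation (an assumption, not documented behaviour).
    """
    count_a = votes.count("A")
    count_b = votes.count("B")
    return option_b if count_b > count_a else option_a
```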
The increasing availability of digital technology led to other approaches, such as the 1992 short film ‘I’m Your Man’ by Bob Bejan. Some mainstream cinemas were actually fitted with seat controllers so that audiences could vote on the main character’s decisions. However, the approach did not take off.
A more modern approach to interactive cinema can be found in works that use online video. ‘The Outbreak’ is an interactive zombie film viewed in a web browser, in which the user clicks to select actions at certain plot points. The area of interactive audiovisual entertainment also has obvious overlaps with video games, in particular with the history of laser disc games and with innovative highlights such as Half-Life and Dear Esther.
The above examples, and most interactive cinema, have involved the audience consciously selecting film behavior, which can reduce immersion in the story. In 2006, Pia Tikka proposed the concept of enactive cinema, in which the audience do not consciously choose story directions. Tikka used the heart rate, breathing and movements of the audience to change the experience of an installation called ‘Obsession’. This work also proposed a framework for basing various dimensions of the cinema experience on non-conscious audience inputs. Tikka and her collaborators have since investigated the relationship between enactive cinema and neuroscience.
In 2006, Tore Vesterby used eye gaze to control the direction of a two-minute video clip. Another pioneer of this non-consciously controlled form of storytelling has been Marc Cavazza, whose team used eye gaze (2010) and emotional neurofeedback (2014) to drive narratives. A key element of Cavazza’s work is the use of AI “characters” that inhabit the narrative space and have internal states and interactions. The downside of eye gaze is that the technology remains relatively costly and physically large compared to bio-signal sensors, whose cost and size have been shrinking more quickly and which are now being incorporated into, for example, wearable items.
In 2008, Joep Kierkels attempted to detect audience interest during movie scenes using various bio-signals, but could not quantify the precise nature of ‘interest’. In 2012, Stephen Gilroy also attempted to measure interest, in a simple computer-generated graphical drama, with single viewers at a workstation. Also in 2012, Thierry Castermans measured audience bio-signals in a cinema to see whether emotional reactions could be detected; the results indicated that such detection was possible.
In 2011, Filmtrip premiered ‘Unsound’ at SXSW, a short film that monitored audience perspiration to adjust its soundtrack, though not its story. The first public cinema screening of a film that used multiple biosensors to adjust its narrative came in 2013 with the short film ‘Many Worlds’, which monitored the brainwaves, heart rate, perspiration and muscle tension of four audience members.
The film depicted a human version of the Schrödinger’s Cat experiment and had four possible scripts, with four possible endings. All four scripts were filmed; during the screening, bio-signals were monitored live from a sample of the audience, and a computer used this data to select, moment by moment, which version of the film was shown, depending on how bored or interested those audience members were at the time.
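The selection step can be sketched as code. ‘Many Worlds’ did not publish its actual algorithm, so everything below, including the sensor weights, ranges and threshold, is an illustrative assumption about how such a selector could work.

```python
def normalise(value, lo, hi):
    """Clamp a raw sensor reading into the 0..1 range."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def audience_engagement(readings):
    """Average a crude engagement score over the sampled viewers.

    Each reading is a dict of raw heart rate (bpm), skin conductance
    (microsiemens) and muscle tension (arbitrary 0..1 units). The
    weights and normalisation ranges are hypothetical.
    """
    scores = []
    for r in readings:
        score = (
            0.4 * normalise(r["heart_rate"], 55, 110)
            + 0.4 * normalise(r["skin_conductance"], 1.0, 12.0)
            + 0.2 * normalise(r["muscle_tension"], 0.0, 1.0)
        )
        scores.append(score)
    return sum(scores) / len(scores)

def choose_branch(readings, calm_branch, tense_branch, threshold=0.5):
    """Pick the next scene: a bored audience gets the tenser branch."""
    if audience_engagement(readings) < threshold:
        return tense_branch
    return calm_branch
```

The key design point is that the viewers never press anything; the branch decision falls out of signals they produce involuntarily, which is what preserves immersion.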
Another key element of “Many Worlds” was that the narrative of the film was metaphorically linked to its viewing technology. The film’s drama came from the quantum phenomenon of the observer affecting the observed. The narrative-adjusting technology meant that the actual observers of the film, the audience, were affecting the film, turning the screening into a metaphor for a two-box Schrödinger’s Cat experiment (as shown in the image above).
This brings us to the most famous interactive movie: Bandersnatch, released by Netflix in 2018. Bandersnatch requires conscious interactivity (you have to press buttons on a remote), which breaks immersion. However, this is overcome to some degree in two ways: (i) if viewers do not select a route, the film selects one for them; and, more importantly, (ii) as in ‘Many Worlds’, the technology behind Bandersnatch becomes part of the story (you have to watch it to understand that point…).
In the end, however, it is the power of eye-gaze and bio-sensor approaches that holds the future of interactive cinema. Not only can they maintain the audience’s immersion, they might even increase it by dynamically manipulating plot, editing, or soundtrack elements in response to the audience.