Broken Night: A Tribeca Film Festival Debut, Brought to you by A-Frame

The only fingerprints on the gun are hers. Now she has to convince investigators she didn’t fire it at her husband. The main character of Broken Night spends a frantic evening trying to sift through her vivid and terrifying memories of a traumatic incident to figure out the chain of events that led to her interrogation.

Each memory is an interactive VR encounter. The audience can pick which memories to follow as they sift through the remnants of a shattered evening — and shattered lives — to a startling conclusion. Broken Night premiered to critical acclaim at the Tribeca Film Festival in April 2017 and made its international debut at the Cannes Film Festival in May. It was in many ways the model for the upcoming generation of scrappy, low-budget, but high-quality virtual reality films.

Eko, a New York-based interactive media technology company, created the unique VR experience using A-Frame, a Mozilla-supported framework for building virtual reality experiences that simplifies online VR creation for anyone with a basic knowledge of HTML.
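To give a sense of how approachable that is, here is a minimal A-Frame scene. This markup is a generic illustration, not code from Broken Night; the asset IDs and file names are placeholders:

```html
<!-- A minimal A-Frame scene: a 360° video sphere plus a clickable marker.
     File names and IDs below are placeholders, not assets from the film. -->
<html>
  <head>
    <script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-assets>
        <video id="memory" src="memory.mp4" loop></video>
      </a-assets>
      <!-- Wrap the 360° footage around the viewer -->
      <a-videosphere src="#memory"></a-videosphere>
      <!-- A simple gaze target the viewer can interact with -->
      <a-sphere position="0 1.5 -3" radius="0.25" color="#EF2D5E"></a-sphere>
      <a-camera><a-cursor></a-cursor></a-camera>
    </a-scene>
  </body>
</html>
```

A handful of HTML tags is enough to get a scene rendering in the browser, which is what made the fast iteration Eko wanted possible.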

Every second of the suspenseful 9-minute film takes viewers on an intense, immersive journey in which they are alone with the characters. In the film, life imitated art: actors and real-life married couple Emily Mortimer and Alessandro Nivola play spouses on screen.

Eko co-founder Tal Zubalsky explained how his team experimented with different VR production methods, then turned to their developers for advice. After consulting with lead developer Opher Vishnia, the team decided to use A-Frame to construct the project because of its ability to create immersive experiences.

“We knew we wanted something we could do a lot of fast iterations on. I started playing with A-Frame, then took the leap and started integrating it with our own platform. This was really exciting because we could get the UI (user interface) up and running much faster,” says Vishnia. It took the crew nearly a year to conceptualize, develop and produce the project, about three months of which were dedicated to technical implementation.

He elaborated, “…we were getting new video assets in and testing UI while we were doing the project, so it was evolving as we went since it was an experiment for us to understand what UI works best.”

As Mortimer told The Hollywood Reporter, “There are always many different versions of any story.” Mortimer, who also served as an executive producer, shared with THR, “With conventional filmmaking, the writer, the director and the performer all have to plump for one version and make that choice for the audience. But with VR, the audience is the one doing the choosing — they’re much more active in the experience.”

Making sure viewers experienced seamless transitions as their choices unfolded ended up being one of the biggest challenges for Eko. The production company shot the same scene multiple times to capture how each character experienced it from a different perspective. Next came the challenge of transferring the terabytes of data that process created. The team had to make sure the video player could handle video and audio while maintaining high frame rates, to keep the experience smooth and the viewer from feeling seasick (a common problem with live-action VR content). Eko's developers worked in both New York and Tel Aviv, which required a new way of working. The solution? Identical setups and devices, so that developers in both locations were seeing the same scene at the same time.
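One generic A-Frame mechanism that helps with seamless cuts of this kind is preloading: declaring every take of a scene inside `<a-assets>` so the browser fetches them before playback begins. The snippet below is an illustration of that idea, not Broken Night's actual player; file names and IDs are placeholders:

```html
<!-- Generic illustration: preload both takes of a scene so that switching
     perspectives doesn't stall playback. Placeholders, not film assets. -->
<a-scene>
  <a-assets timeout="10000">
    <video id="take-her-pov" src="scene1-her.mp4" preload="auto"></video>
    <video id="take-his-pov" src="scene1-his.mp4" preload="auto"></video>
  </a-assets>
  <!-- Start on one perspective; a script can swap the videosphere's src
       to "#take-his-pov" when the viewer chooses that memory. -->
  <a-videosphere src="#take-her-pov"></a-videosphere>
</a-scene>
```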

Along the way, Mozilla was there to help. Vishnia regularly dropped questions into Mozilla’s A-Frame Slack channel, and the feedback he received helped the Eko team resolve multiple issues while developing the film. “A lot of the questions and discussions flowing there were helpful. We ended up writing an open source component for the film, the A-Frame Spritesheet component, and also submitted it to the A-Frame registry,” he added. That component lets developers control animations built from pre-rendered frames.
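The published component's exact API isn't reproduced here; the sketch below only illustrates the core idea behind spritesheet playback: mapping elapsed time to the texture offset of one frame in a grid of pre-rendered frames, which an A-Frame component's `tick` handler could then apply to a material. All names and the layout convention are illustrative assumptions, not the component's real interface:

```javascript
// Illustrative spritesheet math (NOT the actual aframe-spritesheet-component
// API). Given a sheet laid out as a columns × rows grid of frames, map
// elapsed time to the UV offset of the frame that should be visible.
function spriteFrameOffset(elapsedMs, { columns, rows, fps }) {
  const totalFrames = columns * rows;
  // Which frame are we on, looping when the animation ends?
  const frame = Math.floor((elapsedMs / 1000) * fps) % totalFrames;
  return {
    // Texture offsets are in [0, 1]; v counts rows from the top of the
    // sheet downward (assuming a top-left-first spritesheet layout).
    u: (frame % columns) / columns,
    v: 1 - (Math.floor(frame / columns) + 1) / rows,
  };
}

// In an A-Frame component, tick(time) could feed `time` into this function
// and write the result to the material's texture offset each frame.
```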

The growing pains weren’t limited to the developer side. The project was unlike any producer Rose Seyfried had worked on, and it meant creating a whole new way to manage workflow. “There were a lot of moving components. The project was an experiment for us to try and understand how live-action interactive VR can work best.” High-profile actors on set had to get used to the large VR-style camera next to them, and to the complex staging that VR filming requires because VR cameras film in every direction.

“It was amazing to see such pros in action. For Emily (Mortimer), it was weird because on a regular set you’re used to tons of people, equipment, and lights. But in this case, everyone has to leave the room except for a giant camera and she has to perform,” explained co-director Tal Zubalsky.

“The vast majority of live action VR is very passive… you observe the scene without participating. What we bring to the table is actual interactivity, not only taking you into a different world using VR, but giving you the power to affect things in this world,” noted Zubalsky.

The Eko team feels strongly that WebVR is going to be a necessary building block for immersive storytelling. Lead developer Vishnia likened it to trying out a paintbrush or a new tool: once you get it, you have to learn what you can paint with it.

Says Zubalsky, “Vendor and browser participation is paramount, but other browsers are lagging behind Mozilla. We are waiting for WebVR to become a true standard; at that point so much more content will be created, and definitely so much more content will be consumed.”
