What it takes to create a 360 video about the Zodiac Killer
Students talk about editorial and technical considerations that came with producing a 360 video about a serial killer.
This Medium post was originally published here.
By: Katlyn Alo, Anthony Miller, Marta Oliver-Craviotto and Matt Shimura
When we decided to create a 360 video on the Zodiac Killer, we had a few “news pegs” (current events that make a piece timely or relevant). A lead detective on the Zodiac’s case, Dave Toschi, died in January, and we were approaching the 50th anniversary of the Zodiac’s first confirmed murder in December.
But, as this video was created for a class in 360 journalism, we weren’t searching for a medium to match a given story; we were looking for a story that could — should, rather — be told in an immersive experience.
We wanted to tell a story that was spatially interesting. We did not want to waste our audience’s time with 200-plus degrees of space that wasn’t really relevant to the narrative (which is why we believe interviews don’t work in 360). What better example of an environment in which time and space are important than the scene of an unsolved crime?
All told, the process was not easy. For most of us, it was the first time we had created a 360 experience. The learning curve is steep, but we hope that this post will lessen the shellshock for other virtual reality rookies.
360 video needs a transparency statement
360 video is new. And with that newness comes a lack of audience “literacy.” When you see a posterboard ad for CoverGirl at the store, you know that image has been altered. You have developed that literacy, that familiarity with the medium that allows you to read between the lines. Yet, even with cosmetic ads, there is controversy over whether that common knowledge of altered images amounts to real transparency.
With 360, we can bet that, without an explicit and bold statement that the footage has been altered, our audiences will likely not suspect that the environment they are immersed in is not completely true to the real-life version. On top of that, 360 cameras are obscure. People passing by likely don’t know they are being filmed, and, even if they did, there is no such thing as walking “out of frame.”
For “Beyond Zodiac,” this was rarely an issue; most of our shots were away from people. But, as we ducked behind bushes and tree stumps and little beach hills to get out of our shot, it occurred to us that, if someone who didn’t want to be identified were to pass our camera, we had no way to warn them or ask for permission. As we see a move toward live video, we should be concerned about what that would look like in 360: no out-of-frame, no post-production, just the world streamed live with no takesies and backsies.
Graphics are dope but also hard to make sometimes
The benefit of 360 video is that you can work in multiple 3D environments. Besides the live-action footage we shot, our group wanted to incorporate simple computer graphics in the form of a title sequence and timeline. To do this, we used the Unity game engine to create 3D objects and animations that could showcase our historical footage.
We started out by creating a simple scene, then added 3D models and textures from there. To convert this footage from 2D to monoscopic or stereoscopic VR, we used a Unity plugin called VR Panorama 360 PRO Renderer. The plugin let us adjust frame rate, resolution and a number of other factors to sync the settings of our 360 CG footage to those of our 360 live-action footage.
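That syncing step is mostly bookkeeping: the CG render has to match the live-action footage on frame rate and resolution, and monoscopic equirectangular frames should come out at a 2:1 aspect ratio (360 degrees wide by 180 degrees tall). A minimal sketch of the sanity check, with made-up clip metadata (these dicts are illustrative, not Unity's or the plugin's actual API):

```python
def is_equirect_mono(width, height):
    """Monoscopic equirectangular frames span 360x180 degrees, so width = 2 * height."""
    return width == 2 * height

def mismatched_keys(a, b):
    """Return the settings on which two clips disagree (empty means in sync)."""
    return sorted(k for k in a if a.get(k) != b.get(k))

# Hypothetical metadata for the two sources being matched.
live_action = {"fps": 29.97, "width": 7680, "height": 3840}
cg_render = {"fps": 25.0, "width": 4096, "height": 2048}

print(mismatched_keys(live_action, cg_render))  # frame rate and resolution differ
```

In practice you would read these values out of the camera export and the renderer's output settings rather than hard-coding them, but the check is the same: no mismatched keys before the clips share a timeline.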
After rendering and encoding, we were able to drop the newly converted MP4 file directly into Adobe Premiere, our editing software. With a couple of tweaks to aspect ratios and positions, it fit seamlessly into our timeline.
Post-Production Tips & Use of ImmerJ
It’s best to have an idea of whether you want your video to be monoscopic or stereoscopic prior to working in post-production. It’s an easy enough fix if you want to use both types of video, but the quality of image and changing dimensions may briefly take the user out of the experience. We ended up shooting in monoscopic in our San Francisco locations but stereoscopic everywhere else. Ultimately, we restitched the stereoscopic clips into monoscopic, as we did not have many elements in scenes that would benefit from stereoscopic video.
Similarly, audio is a big consideration when working in 360 video, as you can replicate spatial sound through binaural audio or work with normal, stereo sound. Spatial, binaural audio can be very engaging and take the user even deeper into the experience, but it may not have the same effect for tracks such as narration or special sound effects.
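Spatial audio for 360 video is commonly delivered as first-order ambisonics, where a mono source is spread across four channels according to its direction. As a rough illustration of the idea (the post doesn't specify which audio tooling we used, and the FuMa convention below is just one of several), the encoding is a handful of trig gains:

```python
import math

def encode_first_order_ambisonics(sample, azimuth_deg, elevation_deg):
    """Encode a mono sample into first-order ambisonic channels (W, X, Y, Z).

    FuMa-style gains: W carries the omnidirectional part at -3 dB,
    while X/Y/Z carry the directional (figure-of-eight) parts.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample / math.sqrt(2.0)               # omnidirectional
    x = sample * math.cos(az) * math.cos(el)  # front/back
    y = sample * math.sin(az) * math.cos(el)  # left/right
    z = sample * math.sin(el)                 # up/down
    return w, x, y, z

# A source directly in front (azimuth 0, elevation 0) lands entirely on W and X.
w, x, y, z = encode_first_order_ambisonics(1.0, 0.0, 0.0)
```

This is also why narration doesn't benefit from spatialization: a voiceover has no position in the scene, so it is usually mixed as plain "head-locked" stereo on top of the ambisonic bed.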
ImmerJ, a tool made available to us by R.B. Brenner, director of the University of Texas at Austin School of Journalism, let us place archival images into two of our scenes. The open-source program allows images, text, video and 3D objects to be placed and manipulated in a 360 clip in real time. It was perfect for our use because we did not have to create any animations; we simply inserted PNG image files, which was an intuitive and simple process with ImmerJ. The archival taxi cab photo and the sketch of the Zodiac Killer provide context that would otherwise have been difficult to work in.
Triple-check your location. And then check it again.
Once upon a time, the Zodiac case was cold enough to close, which was, for both journalists and avid Zodiac conspiracy followers, a dream come true. It meant the case files were public records. However, in the time following the 2007 hit feature-length film, “Zodiac,” the case was reopened, and suddenly the investigation materials were no longer records we had a right to view.
We would like to give a shout out to all the Zodiac followers and documentarians who took advantage of that window of time and released the investigation files under Creative Commons. Unlike many murder cases, the Zodiac has an intense following and a wealth of information online. We leaned on the official investigation documents to help us sift through fact, fiction and the mushy in-between.
We would also like to give a shout out to the 360 drone-image enthusiast who pulled over to check out our camera (the Insta360 Pro) and told us we were about half a mile from the actual scene of the Zodiac’s crime. He then drove us with all our equipment to the right spot. If the location doesn’t look like the crime scene photos, keep looking. Google doesn’t always have it right.
Don’t forget the tone (especially if no one is there to remind you)
We interviewed zero sources for this piece. It was entirely a re-creation based on artifacts and records. Coupled with the tidal waves of conspiracy theory and cultish urban-legend narratives, the lack of live sources made it much harder to humanize the piece.
Somehow, the Zodiac Killer sometimes felt more like a myth than a man who took people’s lives. We tried to approach the voiceover in a way that did not over-sensationalize the Zodiac the way some of his followers have, but also did not overlook the lives lost.
“Beyond Zodiac” is so very far from the perfect piece. But in its making, our team got a taste of what this medium could mean for the future of media — everything from advertising, to music videos, to the weather report. Moving forward, we’d like to see 360 and VR creators offer gestures of transparency similar to the post we’ve offered here. We believe that the sheer newness of the technology makes for a greener audience, one that we can’t expect to know when the video has been altered or the dialogue scripted.
In all publication types, but especially in journalism, 360 video may prove to be a powerful tool in shattering compassion fatigue and developing empathy. But we don’t get to forfeit transparency just because it would disrupt the experience.