Troubleshooting Technical Challenges in 360 and VR Post-Production: PART II

Maria Fernanda Lauret
Published in AJ Contrast
Jul 9, 2019
Maria Fernanda Lauret, our post-production lead, frustrated after yet another Premiere crash.

Editors go through a LOT of technical issues during the post-production phase of 360 and virtual reality films. Interestingly enough, there are many ways we can fix these issues without major headaches, or prevent them long before the project reaches the editing suite. It’s also helpful for those who film, direct, or produce these kinds of experiences to understand the difficulties that can arise throughout the post-production process; by knowing what problems to avoid before you start your project, you could shorten your post-production timeline and reduce costs.

Last week, we wrote about how to stabilize and stitch shots and remove tripods with as few hiccups as possible. This week, we’re covering the biggest problems we’ve faced with color correction and sound… and how to avoid and/or fix them.

1. Color Correction and Dealing With Low Light Footage:

PROBLEM:

It is quite challenging to film low-light scenes, even with high-end 360 cameras. Very often, you get super grainy footage with a lot of noise and quality loss in dark areas. There are a few useful noise-reduction tools and plugins that filter frames in a video sequence to differentiate noise from detail, smoothing out your footage frame by frame and removing the digital grain of low-light shots. However, you should be very careful when using these tools, because they can compromise shapes and details depending on how much of the effect you apply, especially when there are people in the scene. If you use too much, you might end up blurring out the tips of people’s fingers or their facial features, for example.

SOLUTION:

The most important thing to remember when filming in 360 is that it is always better to shoot in a flat picture profile, or in RAW, to have more flexibility to color grade your piece in post-production. However, that is not always possible. For our films Yemen’s Skies of Terror and Determined for Hope, both filmed by Yemeni journalists in Yemen, there was no way to equip our journalists with high-end virtual reality cameras. The quickest way to get footage from them was with a consumer camera, which gives you pretty saturated (and often overexposed) exterior shots and underexposed interior shots.

It’s possible to get away with Premiere’s built-in color-correction tools, for instance, but there are also other platforms that can help you achieve better results. I have seen some people use SCRATCH VR/Assimilate to color grade 360 footage, while others use Blackmagic’s DaVinci Resolve, which has received mixed reviews in the industry. To remove subtle digital noise in a 360 scene, you can always apply a minimal amount of the VR De-Noise effect (a value between 0.1 and 0.5), as well as a bit of VR Sharpen (a value between 2 and 15) in Premiere. But for best results, we use Neat Video, which gives you more effective tools and more control over your footage than the VR De-Noise effect.
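If you prefer to do a denoise pass outside the editing suite, something similar can be sketched on the command line with FFmpeg’s hqdn3d filter. This is our own alternative, not a tool mentioned above; the clip names are placeholders, and the filter strengths are starting points to tune against your own low-light footage:

```shell
# Synthesize a short, noisy stand-in clip -- in practice, point at your footage
ffmpeg -y -f lavfi -i "testsrc2=duration=2:size=640x320:rate=24,noise=alls=20:allf=t" noisy.mp4

# Apply a gentle spatial/temporal denoise; raise the four hqdn3d strengths
# gradually, since aggressive values smear fine detail like fingers and faces
ffmpeg -y -i noisy.mp4 -vf "hqdn3d=3:2:6:4" -c:v libx264 -pix_fmt yuv420p denoised.mp4
```

As with the Premiere effects, the safest workflow is to preview at 100% zoom on the areas with people before committing to a strength.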

To clear up noise from both high-quality footage and shots captured by consumer-grade cameras, a tutorial created by Hugh Hou shows you how to prevent — and fix — the noisy aspect of low-light footage from capture to render, as well as how digital compression affects the quality of the final product.

Rough stitched shot from “Oil In Our Creeks,” filmed on the Nokia Ozo.
Final shot, fine stitched, color corrected, and with the Neat Video denoiser applied.

2. Spherical Sound:

PROBLEM:

We should keep in mind all of the restrictions that filming a documentary entails. In immersive documentaries, for instance, you often don’t have the luxury of repeating a 360 take when you are trying to quickly capture something that is already happening. When ambisonic audio is part of your production plan and your camera doesn’t have a built-in spatial audio recorder, that challenge becomes even bigger.

I talked to spatial audio designer Daniel Sasso about the greatest technical challenges that sound designers face, and he emphasized: “Many times, the 360 sound ends up not being thought through for this format beforehand, which means that most times, we need to work with sound captured as standard mono or stereo.” That means that if directional sound is an essential part of your story, and you didn’t manage to capture spatial audio for the scenes you end up using, you will have to improvise in post-production and kind of “fake” the spatialized sound. That is what happened while filming one of our pieces, “The Curse of Palm Oil,” shot on the GoPro Omni rig in the forests of Malaysia. Some of the shots were taken very quickly, either because they were in restricted areas or because the crew found something interesting to film and had only a couple of minutes to set up the camera in the best spot possible.

Another big challenge with spatial audio is creating a piece that will be distributed on different platforms. Depending on where you are showcasing or publishing your video, the required number of channels and the final specs differ, so it is best to know what those platforms are before starting your edit. There are specific requirements for YouTube, Oculus, HTC Vive, and Facebook, for instance, and adjusting your project afterward to add one more distribution platform can be a pain. The VR editor Lorraine Cardoza, who has experience editing documentaries in the 360 format, told us that when it comes to the technical challenges of VR post, “I personally have dealt with many different technical troubles from ingestion to delivery, but the biggest challenge has always been delivery and accommodations for different platforms for both sound and video.”
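One quick sanity check before delivery is confirming the channel count a platform’s ingest will actually see in your export. A minimal sketch using ffprobe (the file name and the four-channel first-order-ambisonic assumption are ours, not taken from any one platform’s spec):

```shell
# Build a stand-in delivery file with a 4-channel (first-order ambisonic) bed
ffmpeg -y -f lavfi -i "sine=frequency=440:duration=1" \
  -af "pan=4c|c0=c0|c1=c0|c2=c0|c3=c0" -c:a aac check.mp4

# Report the audio channel count of the first audio stream
ffprobe -v error -select_streams a:0 -show_entries stream=channels -of csv=p=0 check.mp4
# prints: 4
```

Running this against each deliverable takes seconds and catches a mismatched channel count before a platform rejects the upload.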

SOLUTION:

One great thing that made “The Curse of Palm Oil” more authentic and rich in sound-design detail was that our producers on this project made sure to separately record distinctive and interesting sounds of the forest coming from very specific places: animals and birds, bulldozers, leaves, etc.

Drew Ambrose and Sarah Sin recording the sounds of a monkey in Malaysia.

These recordings, along with additional free sounds I found on FreeSound, allowed me to add sound effects to boost the spatialized sound, especially on the shots that only had stereo audio from the camera itself. For some of these shots, however, adding separately captured sound effects wouldn’t help, so what I did was duplicate the stereo audio files across different layers in Reaper and apply the Facebook 360 plugin to emphasize the direction of the action on one side more than the others.

Regarding the different channel counts and render settings for distribution on different platforms: when I mixed audio for Contrast’s first documentary, “I Am Rohingya,” for instance, my whole workflow was focused on publishing the content on Steam VR/HTC Vive using the ATK plugins (similar to Facebook 360, they help you design spatial audio in Reaper and other audio editing programs), and I had trouble adjusting it to other platforms. Using Facebook 360 in Reaper really streamlined my workflow, because it let me encode the same audio mix for YouTube, Oculus, and Facebook using the FB360 Encoder. Lorraine also illustrates the struggle with her own past experiences: “I’ve worked on a project where the sound had an ambisonic mix, but Premiere would mix up its channels, so I worked with the sound mixer to find a command using FFmpeg to manipulate it to keep the structure of the audio layers and marry the video. Along with that, the videos in stereo 4K codec needed to be an H264, which also was not supported by Premiere. So using FFmpeg again, I used commands and forced the codec until it was playable online and in the headset.”
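Lorraine’s FFmpeg fixes can be sketched roughly as follows. Her exact commands aren’t in the article, so everything here is hypothetical: the file names, the scrambled channel order, and the target order are placeholders you would match to how your NLE actually reordered the mix:

```shell
# Stand-in for an NLE export whose 4-channel ambisonic bed came out scrambled
ffmpeg -y -f lavfi -i "testsrc2=duration=2:size=640x320:rate=24" \
  -f lavfi -i "anoisesrc=duration=2:color=pink" \
  -af "pan=4c|c0=c0|c1=c0|c2=c0|c3=c0" \
  -c:v libx264 -pix_fmt yuv420p -c:a aac -shortest scrambled.mp4

# 1) Reorder the audio channels back to the layout the ambisonic mix expects;
#    the source indices (c0, c2, c3, c1) are placeholders for the real scramble
ffmpeg -y -i scrambled.mp4 -af "pan=4c|c0=c0|c1=c2|c2=c3|c3=c1" \
  -c:v copy -c:a aac remapped.mp4

# 2) Force the video stream to H.264 so it plays online and in the headset
ffmpeg -y -i remapped.mp4 -c:v libx264 -pix_fmt yuv420p -c:a copy delivery.mp4
```

The key point is that the channel remap and the codec change are two separate passes, so you can verify each one in isolation before rendering the final deliverable.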

Stay tuned for next week — we’re covering animation and sound!
