Troubleshooting Technical Challenges in 360 and VR Post-Production: Part I

Maria Fernanda Lauret
Published in AJ Contrast · Jun 26, 2019 · 10 min read
Maria Fernanda Lauret, Contrast’s post-production lead, in the office.

Editors go through a LOT of technical issues during the post-production phase of 360 and virtual reality films. Fortunately, there are many ways we can fix these issues without major headaches, or prevent them from happening long before the project reaches the editing suite. It’s also helpful for those who film, direct or produce these kinds of experiences to understand the difficulties that can arise throughout the post-production process; by knowing what problems to avoid before you start your project, you can shorten your post-production timeline and reduce costs.

From stabilizing shaky shots to tripod removal, we’re outlining the biggest problems we’ve faced when working with 360 footage… and how to avoid and/or fix them.

Stay tuned for next week, when we discuss how to avoid some of the biggest mistakes when colour correcting, mixing spatial audio and adding animation and illustrations to your films.

1. 360 Video Stabilization

PROBLEM:

One of the greatest complaints from those who watch videos in a VR headset is the motion sickness caused by unstabilized, shaky shots. This can compromise the quality of the overall experience, so if you are producing a VR film and plan to add stunning drone shots, images captured on a rover to generate a bit of movement, or footage filmed while walking through a crowd with the camera held up high, it is important to make sure your horizon is stable and the movement is as smooth as it can be.

SOLUTION:

But, let’s say you end up with a shaky and unstable shot, even though you tried your hardest to make your movements as smooth as possible. Are there ways you can fix that in post? Yep, you’re in luck! Some people use SkyBox Studio V2; others do it while stitching in Autopano Video Pro (which can be very effective and quicker than other methods if your shot is monoscopic). Similarly (though less effective, from my perspective), you can use Mistika VR to stabilize your shots. Finally, you can use the After Effects plugin Mocha VR, which is the best tool if you are trying to stabilize stereoscopic footage and/or if you have at least one trackable object, preferably near the horizon line (for instance, a person holding the tripod while surfing or skiing, a tall building or structure, etc.). Mocha VR is also very useful for object removal and for getting rid of the tripod during moving shots. Check out this Mocha VR tutorial, which walks you through different tracking options.
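Under the hood, horizon leveling amounts to re-projecting the equirectangular frame through a rotation of the sphere, once per frame, after tracking how the camera tilted. Here is a minimal sketch in numpy of that re-projection step (nearest-neighbor sampling and the function names are mine for illustration; the tools above track the rotation automatically and interpolate properly):

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """Combined rotation (radians) applied to view directions.
    Axis convention here: x toward lon=0, y up, z toward lon=90°."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])  # yaw (vertical axis)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])  # pitch
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])  # roll
    return Ry @ Rx @ Rz

def equirect_rotate(frame, yaw=0.0, pitch=0.0, roll=0.0):
    """Re-project an equirectangular frame so the horizon sits level."""
    h, w = frame.shape[:2]
    j, i = np.meshgrid(np.arange(w), np.arange(h))
    lon = (j + 0.5) * 2 * np.pi / w - np.pi      # pixel-center longitudes
    lat = np.pi / 2 - (i + 0.5) * np.pi / h      # pixel-center latitudes
    # Unit direction vector for each output pixel
    d = np.stack([np.cos(lat) * np.cos(lon),
                  np.sin(lat),
                  np.cos(lat) * np.sin(lon)], axis=-1)
    d = d @ rotation_matrix(yaw, pitch, roll).T  # rotate the sphere
    # Back to spherical, then to source pixel coordinates
    src_lon = np.arctan2(d[..., 2], d[..., 0])
    src_lat = np.arcsin(np.clip(d[..., 1], -1, 1))
    src_j = np.round((src_lon + np.pi) * w / (2 * np.pi) - 0.5).astype(int) % w
    src_i = np.clip(np.round((np.pi / 2 - src_lat) * h / np.pi - 0.5).astype(int),
                    0, h - 1)
    return frame[src_i, src_j]
```

A zero rotation returns the frame unchanged, and a 180° yaw simply shifts the panorama halfway around, which is a handy sanity check when testing a remap like this.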

Now, for my dear friends who shoot 360 videos using drones: make sure the individual lenses of your camera are not vibrating on their own. That is going to be a huge headache for your post-production team, and the result will most likely not look smooth, despite all of your efforts.

2. Stitching Stereoscopic Footage

PROBLEM:

When it comes to stitching, the first thing anyone working in post-production hopes is that the shooter does a mindful job. It is quite tricky to stitch a 360 shot together when people or objects pass very close to the camera; when there is not enough overlap between lenses, it’ll be very difficult to get a seamless image. Things get more complicated still when you are working with stereoscopic 3D footage. As a reminder, stereoscopic 360 footage is captured by cameras such as the ZCam V1 PRO and Insta360 PRO that mimic human vision: they film the same scene from slightly different angles and generate separate images for the left and right eye. That creates a sense of depth, while monoscopic videos are flat and the image looks the same for both eyes. That being said, any small discrepancy in the seam line of a stereoscopic shot ends up being amplified by the 3D effect, and becomes very noticeable when watching the material in a VR headset. Sometimes these discrepancies are in different spots per eye, which can cause discomfort.

Monoscopic image from the 360VR doc “From Waste to Taste,” filmed with the GoPro Omni rig.

Stereoscopic image from the 360VR doc “We Shall Have Peace,” filmed with the Samsung Round

Nathalie Mathe, a VR creator and post-production supervisor based in San Francisco, has been working as an artist and technology expert in film visual effects for 25 years. She brings some of her technical knowledge into virtual reality projects. When it comes to stitching stereoscopic footage, she says: “It’s nearly impossible to get good stereo during stitching and cleaning up when you have objects or people very close (less than 3 feet) to the camera, and other elements in the background are much farther away.”
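Mathe’s three-foot rule of thumb falls out of simple parallax arithmetic. A rough back-of-the-envelope sketch, assuming a typical ~64 mm interaxial distance (actual rig spacing varies by camera):

```python
import math

def angular_parallax_deg(distance_m, interaxial_m=0.064):
    """Angle subtended between the left- and right-eye views of a
    point at the given distance (assumed ~64 mm interaxial)."""
    return math.degrees(2 * math.atan(interaxial_m / (2 * distance_m)))

near = angular_parallax_deg(0.9)   # subject ~3 feet away: ~4 degrees
far = angular_parallax_deg(10.0)   # background ~10 m away: under half a degree
# A seam crossing both must reconcile ~4 degrees of parallax on the near
# subject against a fraction of a degree on the background -- no single
# stitch offset can satisfy both, which is why close subjects break stereo.
```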

SOLUTION:

I usually use Mistika VR and Autopano Video Pro to stitch footage captured with various cameras, but in more tedious cases, a lot of professionals in the field end up using Nuke and Cara VR, which can give you a more professional outcome and a neater look. However, depending on your budget, deadline and stitch-line complexity, this option can be time consuming and pricey, especially for a one-time project. A temporary Cara VR license isn’t cheap (1,500 USD per quarter), and you will also need a Nuke license (1,629 USD per quarter) to use it. If you don’t have a lot of time on your hands and you are not familiar with the software, it is worth hiring a professional stitcher who can do the work for you in Nuke.

When fixing seam lines, stitching programs usually prioritize either the foreground or the background. Although some programs like Mistika VR have improved their algorithms with optical flow, the image does not always look perfect. Mathe explained to me that her solution is often “prioritizing what the user needs to focus on in a shot, or stitching for different distances and compositing different versions together, but this is much more time consuming and tricky in stereo.” Hugh Hou, a co-founder of CreatorUp and a VR videographer, regularly posts useful tutorials on 360VR post-production. For him, the best “stitching solution is the one where you can move your stitch line based on the scene and the camera movement.” In Mistika VR, he stitches multiple passes of the exact same 360 take with different stereo edge-point placements, then uses After Effects to fix any remaining stereo issues. For moving shots, whether stereo or monoscopic, he created this tutorial to show the most practical way to use edge points in the new Mistika.
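Mathe’s “compositing different versions together” boils down to blending two renders of the same take with a weight mask. A minimal numpy sketch of that composite (single frame, made-up feathered mask; in practice the mask is drawn by hand per shot):

```python
import numpy as np

def composite_passes(near_pass, far_pass, weight):
    """Blend two stitches of the same frame: weight=1 keeps the
    near-optimized pass, weight=0 keeps the far-optimized pass."""
    w = weight[..., None].astype(float)  # broadcast over color channels
    return (w * near_pass + (1 - w) * far_pass).astype(near_pass.dtype)

# Example: keep the near-optimized stitch in the lower half of the frame
# (where close subjects usually sit) and feather the transition band.
h, w = 8, 16
near = np.full((h, w, 3), 200, dtype=np.uint8)   # stand-in for one stitch pass
far = np.full((h, w, 3), 50, dtype=np.uint8)     # stand-in for the other
mask = np.clip((np.arange(h)[:, None] - h / 2 + 2) / 4, 0, 1) * np.ones((h, w))
blended = composite_passes(near, far, mask)
```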

I have also found Mistika VR to be the most helpful in many cases. Its optical flow is great, although stitch lines might wobble occasionally. That happened to us on a project that is still in development, Still Here. 360 video was one of the mediums included in the experience, and some of the stereoscopic shots were stitched in Mistika VR: the foreground objects looked great, but in one of the shots the background wobbled, and we had to fix it by carefully rotoscoping characters as they walked toward the back of the scene.

3. Tripod Removal

PROBLEM:

We have experienced many issues with tripod removal, especially from shooters who have just started working in the field (e.g., participants in our ‘My People: Our Stories’ initiative, where we trained filmmakers and journalists from all over the world on how to film with a 360 camera). However, it’s not only in the field that we’ve run into tripod-removal trouble. We’ve also faced several technical glitches in the post-production process: linking footage in Premiere to tripod-removed shots in After Effects; export glitches when adding a PNG tripod mask in Premiere; tracking issues while using Mocha VR to remove a rover or drone from a moving shot; and so on. And what about shots where the light continuously changes, and the shadows of people running past the camera are moving? How do we fix shots in that category? What if the footage is stereoscopic? Here are some tips for a good-looking nadir:

SOLUTION:

When I first started working on tripod removals, my workflow was to replace the footage in Premiere with an After Effects composition. This way, the excerpt of the shot opens in AE and any modifications you make are reflected in the Premiere sequence. That seemed very convenient in the beginning, since I didn’t have to spend time rendering shots after tripod removal. However, it became a bit of a nightmare whenever I had to extend a shot: I would have to go back to After Effects, find the original composition, change its length and only then extend it in Premiere (which means your After Effects project needs to be EXTREMELY organized… if you don’t want to go crazy later). Besides, the export would often show a red screen, meaning that some footage was unlinked or missing, or I’d run into glitches in After Effects while using the cloning tool. Facing these kinds of issues near the delivery deadline was a huge no-no, so I changed my strategy.

Another method that works — if you have a static shot in which the light or shadows don’t change throughout — is to pick a frame within the time-codes of the footage you are using, remove the tripod in After Effects or Photoshop (a great option if the camera was not placed on a smooth, solid surface) and render it as a PNG mask. If you have a plate shot of the spot where the camera was placed, you can crop the still and place it on top of the tripod, creating new compositions to tweak and perfect your image.

This option is still convenient to me, because rendering just one frame and creating a mask in Premiere is the quickest choice. However, it has caused some export glitches, for different reasons. When the footage is heavy and you add PNG masks, that is essentially one more layer Premiere has to process, and it can cause export glitches — especially when I’ve fixed the horizon line on the original shot and applied the same VR Rotate Sphere values to the PNG tripod mask to fit the image. Also, if one clip was shot in a different resolution than the rest of the video, bringing in a PNG mask can cause a black line at the back of the 360 export (where the flat edges of the equirectangular image meet). To avoid these issues, the best thing to do is to bring each original 360 clip into After Effects, remove the tripod and render just the section you need as a .mov (I sometimes extend the ending by a few seconds, in case I need to lengthen the shot later, in the final steps of editing).
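Conceptually, the PNG mask is just an alpha-over composite sitting on the nadir region of the frame. A minimal numpy sketch of that composite (the helper name and positioning arguments are mine for illustration):

```python
import numpy as np

def alpha_over(frame, patch_rgba, top, left):
    """Composite an RGBA nadir patch over an equirectangular frame.
    `top`/`left` position the patch; alpha runs 0-255."""
    out = frame.astype(float)
    ph, pw = patch_rgba.shape[:2]
    a = patch_rgba[..., 3:4].astype(float) / 255.0   # per-pixel opacity
    region = out[top:top + ph, left:left + pw]
    out[top:top + ph, left:left + pw] = (
        a * patch_rgba[..., :3] + (1 - a) * region
    )
    return out.astype(frame.dtype)
```

A fully opaque patch replaces the tripod pixels outright, while feathered alpha at the patch edges hides the transition — which is exactly why a soft-edged PNG mask reads better than a hard crop.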

Another issue we face when trying to achieve seamless tripod removal is footage where shadows or light are changing. In this case, the workflow becomes more tedious. At Contrast, what we do is remove the tripod with the cloning tool, based on the time-codes we need in the final edit, and adjust the cloning throughout the scene. To avoid an abrupt color change when the new, fixed layers appear in the middle of the scene, you can select those layers and create opacity keyframes, so they gradually and smoothly fade in to cover up the messy part. Another option, which can give you better results, is to use Mocha VR in After Effects, which will help you track any light and color changes and replicate them on the parts where the tripod has been removed.
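The opacity-keyframe trick is essentially a linear cross-fade between the original shot and the cleaned layer. A rough sketch, assuming frames arrive as numpy arrays (the function name is mine, not an After Effects API):

```python
import numpy as np

def fade_in_patch(frames, patched_frames, start, duration):
    """Ramp the cleaned (tripod-removed) layer's opacity linearly
    from 0 to 1 over `duration` frames, beginning at frame `start`."""
    out = []
    for t, (orig, fixed) in enumerate(zip(frames, patched_frames)):
        alpha = np.clip((t - start) / duration, 0.0, 1.0)
        out.append(((1 - alpha) * orig + alpha * fixed).astype(orig.dtype))
    return out
```

Before `start` the viewer sees only the original plate; by `start + duration` the cleaned layer is fully opaque, so the cloned patch never pops in on a single frame.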

In this car shot, for example, we had a plate shot. However, the light inside the car was changing as the character was driving. What I did was place the still photo on top of the tripod and repaint the photo itself, sampling the colors from the car seat in the video so that the light would change accordingly.

In stereoscopic footage, at the very bottom and the very top of the frame, the 3D discrepancy between the left and right eye is minimal (the zenith and nadir can’t preserve depth information properly). That being said, I am able to use the same tripod-removal mask for both eyes, and it usually looks smooth.

So, there you have it: we’ve discussed the biggest problems we’ve come across, from stabilizing to stitching to tripod masking. For those of you in post-production, have you found other useful solutions for these issues? Did we miss anything? Drop a comment and let us know. And stay tuned for our next post, where we’ll focus on the greatest challenges of colour correcting and mixing spatial audio for a 360VR piece.
