Creating a 360 Stereo Video — Lessons Learned

Feels Like Studio
Feb 15, 2017


Innovation in film has come full circle, and it can be dizzying. We recently shot a 360 film for Google Daydream, gained some valuable insight in the process, and we'd like to share some of it here.

As part of the release of the Daydream VR headset, Google brought us on to create a 360 video to live within in-store Daydream demos across the world. The video, a combination of 360 live-action footage and animated graphics, was created to show what is possible inside the realm of Daydream.

The video concept we designed included a montage of original 360 live-action footage. All we needed was to decide how to shoot it. As it turns out, that decision took quite a bit of R&D (not that we mind!). We came away with an exciting film, and a host of best practices for creating a 360 film.

Below is what we learned during our process.

Step 1: R&D

Shooting 360 Video — Stereo vs Mono

We delved headfirst into our research, beginning with the stereo vs mono debate. After weighing the pros and cons, we found that the depth stereo adds to the footage outweighed the drawbacks of stereo shooting, so we decided to shoot with the JUMP camera, at the time the only 360 rig that shoots stereo footage. One thing we quickly learned is that there is dead space at the top and bottom of the footage, because the JUMP camera does not include cameras facing the sky and the ground.

Stereo gives much more depth to the scene.

Step 2: Shooting with the JUMP Camera

If you have never shot with the JUMP camera before, one fundamental rule is to understand the limitations of your equipment before the day of the shoot. The JUMP camera is made up of 16 GoPro cameras, all controlled by the GoPro labeled "1". Camera 1 is the master camera: any setting you change on it changes on all of the cameras, and pressing record on it starts recording on the other 15.

Testing beforehand

Just like on any shoot, there are many variables that come into play: time of day, inside vs outside, the number of people in the scene, how close the action happens to the camera, and so on. Testing camera height can help you set up the scene going into the shoot. If the camera is too high or too low, you can feel very small or very tall in the VR landscape.

Natural Light

Timelapse from afternoon to night on the JUMP camera

The JUMP camera can do a lot, but at the end of the day it is still made up of 16 GoPro cameras. The best times to shoot are early in the morning, around noon when the sun is directly overhead, or at sunset. Shooting inside can work well if there are plenty of windows for natural light to shine in. Setting up lighting can be tricky because the camera sees, after all, a full 360 degrees. Time may be a flat circle, but the world of VR is not. Depending on the time of day you shoot, the camera's shadow can be an issue: the lower the sun, the longer the shadow, which can take the viewer out of the experience. Try hiding the camera's shadow by making it read as the shadow of an object that is part of the scene.

Beware of your surroundings

One aspect that makes a 360 shoot different from other shoots is that everything is in frame. You can't hide "behind the scenes" because there isn't a "behind the scenes." This, combined with the JUMP's lack of real-time feedback, means more detailed setups and shot lists.

If you get too close, the stitching starts to break.
Here is another example of testing distances, showing the dead zones at the top and bottom of the frame.

Technology can only do so much

There is no true live feedback yet for the JUMP camera. Unlike on traditional film shoots, you can't get a live feed out of the camera. Our best solution was to create a rig consisting of four GoPro cameras on top of the JUMP camera. These cameras would record the scene, and we would then do a quick stitch on site. We couldn't see all the way around, but it gave us a sense of what we were shooting. We also had two other GoPros facing north and south for a live feed.

Having an SSD to transfer footage can save you hours, because you're usually transferring around 500 GB.
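To give a rough idea of why the drive matters, here is a back-of-the-envelope sketch; the throughput figures are ballpark assumptions for typical hardware of the era, not measurements from our setup.

```python
# Rough transfer-time estimates for a ~500 GB card dump.
# Speeds below are assumed sustained throughputs, not measured values.
CARD_DUMP_GB = 500

drives_mb_per_s = {
    "USB 2.0 card reader": 35,
    "spinning HDD": 120,
    "SATA SSD": 500,
}

for name, speed in drives_mb_per_s.items():
    hours = CARD_DUMP_GB * 1000 / speed / 3600
    print(f"{name}: ~{hours:.1f} h for {CARD_DUMP_GB} GB")
```

Under these assumptions, the same dump drops from roughly four hours over USB 2.0 to well under half an hour on a SATA SSD.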

There is only so much you can do if the rig malfunctions: if one camera's footage is corrupt, the rig either won't work or will indicate which camera is causing the error. It is also essential that the camera is always level, or the stitching may not come out right.

Google created an app that lets you upload your footage to their servers, where they stitch it for you with a turnaround time of 1 to 3 days. Google sends back high-res footage plus low-res proxies to optimize your post workflow.

Step 3: Post Workflow for 360 Video

Workflow: There is no good workflow (yet) ¯\_(ツ)_/¯

At the time we are writing this, no 360 VR workflow is perfect, and most will require beta software, plugins, and custom scripting. In our case, using the JUMP camera removed the need for stitching. We used Premiere for editing, with the Mettle plugin for live proxy preview in the Oculus Rift as well as light compositing. After Effects was used in a more traditional manner for masking and rotoscoping. Cinema 4D and CV-VRcam, along with photogrammetry captured on set, allowed for stereoscopic VFX compositing over the JUMP footage.

One of the challenges of working with stereo footage is matching up the left and right eye.
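As a minimal sketch of what "two eyes in one frame" means in practice, assuming the stitched output uses the common over-under layout (left eye stacked above right, consistent with the 6144x6144 frame size discussed later in this post):

```python
import numpy as np

def split_over_under(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split a stereo 360 frame into per-eye images.

    Assumes an over-under layout: the left eye occupies the top half
    and the right eye the bottom half, so a 6144x6144 frame yields
    two 3072x6144 equirectangular views.
    """
    h = frame.shape[0] // 2
    return frame[:h], frame[h:]

# Toy example with a blank 6144x6144 RGB frame
frame = np.zeros((6144, 6144, 3), dtype=np.uint8)
left_eye, right_eye = split_over_under(frame)
print(left_eye.shape, right_eye.shape)  # (3072, 6144, 3) each
```

Any mask or composited element then has to be applied consistently to both halves, with the correct horizontal offset between eyes, or it will appear to float at the wrong depth.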

Powerful + recent hardware is critical.

Due to the enormous bitrate and file size of VR video files, having very powerful hardware is critical. Playback without creating smaller-resolution proxies isn't yet doable, and even encoding full-res files brings the most powerful rigs to their knees.
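As an illustration of the proxy step, here is a minimal batch sketch using ffmpeg (which must be on your PATH); the folder names, scale factor, and quality settings are our own illustrative assumptions, not the exact settings from this project.

```python
# Batch-generate quarter-width H.264 proxies from full-res masters.
# Paths and encoding settings are hypothetical placeholders.
import subprocess
from pathlib import Path

SRC = Path("footage/full_res")   # hypothetical folder of 6K masters
DST = Path("footage/proxies")
DST.mkdir(parents=True, exist_ok=True)

for clip in SRC.glob("*.mp4"):
    subprocess.run(
        [
            "ffmpeg", "-i", str(clip),
            "-vf", "scale=1536:-2",      # shrink 6144-wide frames 4x
            "-c:v", "libx264", "-crf", "20",
            str(DST / f"{clip.stem}_proxy.mp4"),
        ],
        check=True,
    )
```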

In addition to raw power, having recent CPU and GPU architectures that are optimized for new codecs and video technologies widely used in 360 VR is essential.

Everything takes more time, so plan ahead.

Producing 360 VR content is tedious, and even the most mundane task can be more time-consuming than anticipated. The amount of time, whether for testing or actual production, needs to be identified early on and accounted for. For editing, the need to proxy all the footage adds an extra step that is usually unnecessary when delivering for a mobile platform, while compositing and post still need to be done on the full-res footage. For example, we had to rotoscope a 4-second sequence of 6K×6K footage at 60 fps, which contains more frames and many times more pixels than a traditional full HD video.

To give a sense of scale, the 6K stereoscopic footage coming out of the JUMP carries about 2.265 billion pixels per second, versus 62 million pixels per second for 1080p 30 fps footage (6144×6144×60 vs 1920×1080×30). That is roughly a 36-fold increase over traditional full HD footage, which of course heavily impacts workflow and render times.
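The arithmetic is easy to verify from the frame dimensions and frame rates above:

```python
# Pixel throughput: 6K x 6K stereo at 60 fps vs full HD at 30 fps
jump = 6144 * 6144 * 60   # stereo JUMP output
hd = 1920 * 1080 * 30     # traditional 1080p 30 fps

print(f"JUMP: {jump:,} px/s")       # 2,264,924,160 (~2.265 billion)
print(f"HD:   {hd:,} px/s")         # 62,208,000 (~62 million)
print(f"Ratio: {jump / hd:.1f}x")   # ~36.4x
```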

Simplify your workflow: get rid of unnecessary steps and plugins to avoid crashes and keep project files manageable.

The lack of proper VR-optimized tools and raw hardware power requires layering plugins on top of a more traditional VFX, compositing, and editing workflow. We used a couple of pieces of software still in beta, as they were the only ones able to handle 6K stereoscopic footage.

Those tools can interfere with traditional workflows and cause conflicts or crashes. We spent time up front defining which tools we needed and testing compatibility across the whole pipeline, from shooting with the JUMP camera to delivering the final ProRes file to the client.

In Conclusion

Testing out the capabilities and limitations of 360 film with the JUMP rig was an exciting and lesson-filled experience for us. We hope you find our learnings useful, and if you have tips of your own, share them with us at media@b-reel.com!
