Using Volumetric Video for Good

Arcturus — An XR Studio
4 min read · Nov 6, 2018


Volumetric video provides all the micro-nuance of performance and muscle movement, giving the viewer a sense of presence. But what if that performer could interact with the viewer, track their location in space, and talk directly to them?

Well, they can, and we recently had the opportunity to use our technology to convey an incredibly important message for a worthy cause.

Santander Bank, as part of its marketing and customer outreach effort, launched an initiative to generate awareness around the multiple faces of homelessness. The goal was to show people the breadth of the problem, specifically that many people with respectable jobs don’t have the means to afford housing and are forced to live out of their car. The bank sponsored a walkathon that donated $10 for every step taken by volunteers on their national awareness day to fight homelessness. To encourage people to join the walkathon, they needed an attention-grabbing experience that would help consumers empathize with the homeless and take action. Arnold, the bank’s ad agency, imagined a 5-minute volumetric video capture of a nurse living out of her car, viewed in life-scale augmented reality.

We know that life-scale augmented reality combined with a volumetric performer creates a direct connection with the viewer and can inspire empathy and awareness.

The experience starts with the appearance of a station wagon with a woman, Jen, asleep inside. Jen wakes up, gets dressed for the day, and heads to work. She returns later in the day to find that her car has been broken into and her belongings strewn about. Distraught, she cleans up and heads out for the evening. She returns to the car just as her mother calls, and she steps away from the car so as not to reveal her situation. After the call, she settles into the trunk to sleep for the night. In the final scene, we see her return from work to find her car booted. It is the last straw: she breaks down and turns to us for help.

On the surface it is a relatively simple experience that keeps the focus on Jen’s situation, but underneath there were several technical challenges:

  • A full-scale car could not be placed on the volumetric capture stage
  • The experience had to be distributed on mobile devices
  • The performer had to stay focused on the viewer (dynamic retargeting)

These challenges were solved with our volumetric video post-production platform, HoloSuite.

Because volumetric video reconstruction does not work if the actor is occluded by physical objects, we could not put a full-size station wagon on the capture stage. Our car was going to have to be CG while the performer was captured volumetrically. We limited the objects on the stage to the minimum that Jen had to interact with: the seat, the door handles, and the trunk floor. We precisely measured the location of the door handle so that Jen’s hand could be accurately aligned with the car; however, props shift when handled, so we could not guarantee that the position of her hands would match the position of the door in the final experience. HoloSuite provides tools to refine a performance and retarget the limbs to match a new position. Once Jen’s capture was in the system, we aligned her performance with the CG car and used inverse kinematics to alter her reach to match the position of the CG door handle.
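To make the idea concrete, here is a minimal sketch of an analytic two-bone inverse-kinematics solve of the kind used to pull a captured wrist onto a CG target. The function, bone lengths, and 2D simplification are illustrative assumptions for this post, not HoloSuite’s actual retargeting tools.

```python
import math

def two_bone_ik(shoulder, target, upper_len, fore_len):
    """Analytic two-bone IK in 2D: return (shoulder_angle, elbow_bend) that
    place the wrist as close as possible to `target`.
    Illustrative only -- real limb retargeting works on 3D skinned joints."""
    dx, dy = target[0] - shoulder[0], target[1] - shoulder[1]
    dist = math.hypot(dx, dy)
    # Clamp the target distance to the reachable range of the two bones.
    dist = max(max(abs(upper_len - fore_len), 1e-6),
               min(dist, upper_len + fore_len))
    # Law of cosines gives the interior elbow angle; the bend is pi minus it.
    cos_elbow = (upper_len**2 + fore_len**2 - dist**2) / (2 * upper_len * fore_len)
    elbow_bend = math.pi - math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: direction to the target, offset by the triangle's
    # shoulder angle (picks one of the two possible elbow-bend solutions).
    cos_inner = (upper_len**2 + dist**2 - fore_len**2) / (2 * upper_len * dist)
    shoulder_angle = math.atan2(dy, dx) - math.acos(max(-1.0, min(1.0, cos_inner)))
    return shoulder_angle, elbow_bend

# Example: pull the captured wrist onto the CG door handle's position
# (all coordinates and lengths here are made up for illustration).
angles = two_bone_ik(shoulder=(0.0, 1.4), target=(0.45, 1.1),
                     upper_len=0.30, fore_len=0.28)
```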

Distribution of volumetric video can be a challenge. A hi-res capture can be on the order of multiple gigabytes per minute, and this experience was no exception: the hi-res capture came out at over 25 gigabytes in its raw form. We knew that a download of that size would reduce the number of potential users. To solve this, we designed our volumetric file format to include adaptive compression, and with a small amount of tuning we were able to reduce the entire experience to 1.02 gigabytes while maintaining high enough visual fidelity that the tears pouring down Jen’s cheeks are visible.
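As a rough illustration of what adaptive compression can mean in practice, the sketch below greedily lowers per-frame quality until a sequence fits a download budget while protecting the most important frames. The function, quality levels, and importance weighting are assumptions made for this example and do not describe the actual HoloSuite file format.

```python
def pick_quality_levels(frame_sizes, importance, budget_bytes,
                        levels=(1.0, 0.5, 0.25)):
    """Greedy sketch of adaptive compression: start every frame at full
    quality, then step the least important frames down a level until the
    whole sequence fits the download budget. Purely illustrative."""
    quality = [0] * len(frame_sizes)  # index into `levels`, per frame

    def total():
        return sum(size * levels[q] for size, q in zip(frame_sizes, quality))

    # Visit frames from least to most important; importance could weight
    # close-ups where facial detail (such as tears) must survive compression.
    order = sorted(range(len(frame_sizes)), key=lambda i: importance[i])
    while total() > budget_bytes:
        stepped = False
        for i in order:
            if quality[i] < len(levels) - 1:
                quality[i] += 1
                stepped = True
                if total() <= budget_bytes:
                    break
        if not stepped:  # every frame is already at minimum quality
            break
    return [levels[q] for q in quality]
```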

The moment when Jen asks for your help is crucial to the experience; the viewer needs to feel like they are in an intimate conversation with Jen, and it is Jen making eye contact that creates this connection. With HoloSuite’s rigging system we can specify “look-at points” in a scene, and if a volumetric performer is looking within a defined angle of a look-at point, we dynamically adjust the angle of the capture’s head and neck so that she is looking directly at it (dynamic retargeting). For this experience, we treat the phone’s position in the world as a dynamic look-at target. The viewer does not know that retargeting is happening, but they perceive that Jen is focused on them and asking for their help, creating a feeling of personal connection.
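A simplified sketch of the look-at test might look like the following; the single gaze vector, function name, and threshold are illustrative assumptions rather than the HoloSuite rig API.

```python
import numpy as np

def retarget_gaze(head_pos, gaze_dir, lookat_pos, max_angle_deg=30.0):
    """Sketch of dynamic look-at retargeting: if the captured performer is
    already looking within `max_angle_deg` of the look-at point (here, the
    viewer's phone), return a corrected gaze direction aimed straight at it;
    otherwise leave the captured performance untouched."""
    to_target = lookat_pos - head_pos
    to_target = to_target / np.linalg.norm(to_target)
    gaze = gaze_dir / np.linalg.norm(gaze_dir)
    angle = np.degrees(np.arccos(np.clip(np.dot(gaze, to_target), -1.0, 1.0)))
    if angle <= max_angle_deg:
        return to_target   # rotate the head/neck toward the viewer
    return gaze            # outside the window: keep the original performance

# Per frame, the AR framework supplies the phone's position in world space
# (coordinates below are made up for illustration).
corrected = retarget_gaze(np.array([0.0, 1.6, 0.0]),
                          np.array([0.1, -0.05, 1.0]),
                          np.array([0.5, 1.5, 2.0]))
```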

We were very proud to lend our expertise to such an important cause, as well as to feature some of our technology. If you want to find out more about the “In Someone Else’s Shoes” experience or HoloSuite, please drop us a line at contact@arcturus.studio.

The volumetric video was produced using the 4DViews capture stage at CMII at Georgia State University.

This project was an amazing collaboration between Santander, Arnold, Arcturus, Bravo Media, GSU CMII, Media Monks and Shaw Walters from Tin Drum.
