Lessons of Auschwitz: a VR project created with the help of volumetric capture. Part 3

Phygitalism Inc · Published in PHYGITAL · Aug 10, 2020 · 7 min read

On January 27 we posted a video art project that we had been working on for 2.5 months. This series of articles is devoted to the concept of our project, covering both its meaning and its technology. We will also show you why you should not fear the unknown.

Part 1 \ Part 2 \ Part 3 \ Part 4

Recording stage

After we were done with the research for the project, we proceeded to the next stage: recording. We had only one day in the green-screen studio :) After that, the only material we could still record was the painting in the headset.

To make the recording process efficient and productive, we needed to prepare everything beforehand. We drew up a list of the required equipment and defined the naming conventions for the project. During a preliminary meeting we discussed how the process would be carried out, which movements we needed to record, and how to use Tilt Brush to achieve the effect of mixed reality by combining the models of the pictures with the movements.

To record the main part of the children painting in Tilt Brush we needed the following equipment:

  • A VR headset and a controller
  • An Azure Kinect camera
  • A computer
  • An external SSD (it would speed up the process of recording and editing)
  • A chroma key studio or green background
  • Good lighting

Mixed reality video recording (XR)

You can find lots of fascinating videos of people painting in 3D space with virtual reality:

Our goal was to achieve the same result, but with visual effects (VFX) applied to people, and pictures reacting to sound. And of course, all of it should be in 4K resolution.

During our research we found the following articles:

They were not of much use to us, because the authors of those articles ended up with flat edited videos, whereas we were interested in the 3D format. Still, we tested all the methods mentioned in the articles.

One of the first tests of recording with chroma key

Tilt Brush in-app video capture

There is a special mixed reality mode in which the screen is split into four quadrants (starting from the upper left corner): the painting scene after the refinement mask is applied, an alpha-channel mask (a greyscale image), the image from a stationary camera, and the image from the headset's mirror mode.

Screen recording while painting in mixed reality mode

The split screen affected the video resolution: since each quadrant occupies only a quarter of the screen, the final video could be played back at only half the display resolution in each dimension. That is why this method did not suit us.
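Conceptually, the quadrant layout can be unpacked and recombined in a few lines. The sketch below (a toy NumPy illustration, not Tilt Brush's actual internals) splits a captured frame into its four quadrants and alpha-composites the camera image over the painting scene; it also makes the resolution penalty obvious:

```python
import numpy as np

def split_quadrants(frame: np.ndarray):
    """Split a mixed-reality capture frame into its four quadrants.

    Layout (from the upper left): masked painting scene, alpha mask,
    stationary-camera image, headset mirror view.
    """
    h, w = frame.shape[:2]
    hh, hw = h // 2, w // 2
    scene  = frame[:hh, :hw]
    alpha  = frame[:hh, hw:]
    camera = frame[hh:, :hw]
    mirror = frame[hh:, hw:]
    return scene, alpha, camera, mirror

def composite(scene: np.ndarray, camera: np.ndarray, alpha: np.ndarray) -> np.ndarray:
    """Alpha-composite the camera image over the painting scene."""
    a = alpha[..., :1].astype(np.float32) / 255.0  # greyscale mask -> [0, 1]
    return (camera * a + scene * (1.0 - a)).astype(np.uint8)

# A 2160p capture yields only 1080p quadrants: half the display
# resolution in each dimension.
frame = np.zeros((2160, 3840, 3), dtype=np.uint8)
scene, alpha, camera, mirror = split_quadrants(frame)
print(scene.shape)  # (1080, 1920, 3)
```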

Instead, we decided to record the Azure Kinect video and the painting in Tilt Brush separately. There is a Spectator Mode for these purposes, but you cannot align the Azure Kinect with the spectator camera, as the camera can be moved only while you are wearing the headset.

An example of shooting in Spectator Mode from a third person

Using Liv and Tilt Brush together

LIV is one of the first apps that comes to mind when working with videos in XR. It allows you to use a real camera for recording and automatically crops the chroma key in order to place a person into their VR world. Calibration is of paramount importance: all controllers (both real and virtual) should be accurately calibrated and synchronised. The result should look like this:

Photo from LIV website
Our test with TiltBrush + LIV

To our disappointment, we found out that the new update of Tilt Brush no longer supported LIV because of technical problems that did not exist in previous versions. We even contacted the LIV developers about the problem, but they could not tell us when the bug would be fixed. This YouTube channel contains videos that may help with setup and recording.

Using three controllers and an HTC Vive Headset

We had one more option. HTC has a peculiar feature that allows you to use three controllers: a real camera can be attached to an HTC controller by means of special holders (or tape :) ). We can connect one more controller via USB and set it as the tracking reference for the virtual camera in LIV. This lets us move the controller together with the real and virtual cameras, making it possible to capture dynamic angles when making VR content.

An example of how you can attach the controller to the camera.

Thus, this method was more appealing to us, but little did we know that the Azure Kinect and the HTC headset were incompatible: both use infrared light, so the devices interfere with each other and cannot work properly side by side.

Our solution

It was clear that none of the methods would work for us. Even if everything had functioned perfectly well, LIV still would not be able to create a 3D scene (only 2D). So we came to the conclusion that the only option left was to record the painting process using the Azure Kinect and imitate the painting in Unity itself. To do so, we decided to use an .fbx model with special shaders.


Recording pipeline:

  1. We set up the Azure camera and the composition according to the plan, with all the necessary angles and takes;
  2. We loaded a Tilt Brush scene with the picture, created beforehand. Then we chose the object that the artist was to paint in a given take and copied it;
  3. In the VR scene we marked the place of the Azure camera for each take: we approached the camera while wearing the headset and drew an upward arrow with the controller (for every new take we also needed to adjust the lighting). This helped us later, when we needed to understand which angle was used in which take in Azure and how to place the model of the picture;
Scene in Tilt Brush (VR). The arrow shows the angle from which we filmed the person, and from which the model should be substituted. Each arrow and its corresponding picture share the same colour.

4. The scene with the new objects and arrows was saved and exported. It is impossible to export .fbx files from an Oculus Quest headset, because it is based on a mobile platform and its apps offer a limited set of functions. That is why we used an HTC Vive instead, exporting the files from the Tilt Brush folder on the computer.

5. Then we imported the results into Unity 3D and worked with the material there.
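The bookkeeping behind the five steps above can be sketched as a small data structure (all names and values here are hypothetical; in practice we tracked this information via the coloured arrows in the scene itself):

```python
from dataclasses import dataclass

@dataclass
class Take:
    """One recorded take: which object is painted and how the Azure
    Kinect shot and the picture model line up."""
    name: str
    painted_object: str  # object copied in the Tilt Brush scene (step 2)
    arrow_color: str     # arrow drawn toward the camera (step 3); matches the picture
    fbx_path: str        # scene exported via the HTC Vive (step 4)

takes = [
    Take("take_01", "house", "red", "exports/take_01.fbx"),
    Take("take_02", "tree", "blue", "exports/take_02.fbx"),
]

# In Unity (step 5), each take's .fbx is imported and the picture model
# is placed at the angle indicated by the matching arrow colour.
for t in takes:
    print(f"{t.name}: place '{t.painted_object}' at the {t.arrow_color} arrow")
```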

Backstage. Drawing recording. The camera is visible in the upper left corner.

What did we learn during our research?

While testing different methods and options, we discovered many interesting nuances. Although we have already mentioned some of them, we would like to summarise:

  • The HTC Vive headset cannot be used together with the Azure Kinect camera because of the infrared transmitter. That is why we had to go with the Oculus Quest headset. It also looked better on camera, even though there were some problems with Tilt Brush (large scenes took a long time to load and sometimes froze)
  • The video was streamed into DepthKit running on the computer, so we could look at the scene from different angles and perspectives in real time. When working with volumetric video, it is recommended to apply refinement masks after everything is recorded (they serve to cut out the green background). If you do not use a refinement mask, DepthKit separates the person using the depth data. It turns out that the depth of some materials and fabrics cannot be measured accurately. This is especially true for reflective surfaces, so we taped over the reflective part of the headset and asked people to wear jeans, t-shirts, or hoodies. Suit fabrics were the hardest for the programme to perceive.
  • You can record videos without a chroma key, but in that case you will not be able to apply masks, so you will have to work with depth data alone (and it may be slightly off). We were lucky enough to get access to a chroma key studio with good lighting.
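The two segmentation strategies from the notes above can be illustrated with a toy sketch: keying on the green background versus thresholding the depth data. The thresholds and array values are made up, and DepthKit's actual refinement masks are far more sophisticated than this:

```python
import numpy as np

def chroma_key_mask(rgb: np.ndarray, threshold: float = 60.0) -> np.ndarray:
    """Foreground mask from a green-screen frame: a pixel counts as
    background when its green channel strongly dominates red and blue."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    greenness = g - np.maximum(r, b)
    return greenness < threshold  # True = keep (person), False = green screen

def depth_mask(depth_mm: np.ndarray, near: float = 500.0, far: float = 2500.0) -> np.ndarray:
    """Fallback without a chroma key: keep pixels whose depth falls in
    the expected range of the person. Reflective surfaces (e.g. the
    headset visor) often report invalid depth (0) and get dropped."""
    return (depth_mm > near) & (depth_mm < far)
```

With a chroma key, the colour mask can be refined after recording; with depth alone, errors on reflective or hard-to-scan fabrics go straight into the final volumetric capture.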

The main difficulty we faced while working on the project was that the methods we tested contradicted each other; we also ran into problems with the software we had chosen. Every stage required deep thinking and creative imagination to overcome the obstacles that arose along the way. We had to think through our every step to choose the tools and methods that would work best, and sometimes we even had to change our approach to the project radically.

We came up with our own ways of solving some of the problems and we will tell you about them in the next part.

<previous part \ next part >

Written by Katya Toxic

toxic@phygitalism.com

instagram: katya.toxic

Phygitalism Inc
An international tech company developing Phygital+, a web-based AI product for creators