Mixed Reality Studios — A primer for producers: Part 1

Nils Porrmann
Published in frame:work
Oct 19, 2020 · 5 min read

In 2020, Mixed Reality has become a much-discussed topic in media production. This article is part of a series, taken from a larger release at dandelion + burdock and inspired by everybody’s efforts at frame:work.

The XR Set

In Mixed Reality, live performers and props are filmed inside an illuminated setup of LED walls, which are mapped with a corresponding virtual environment. This environment is rendered in real time to match the perspective of a tracked camera. The actual stage and performers are lit to correspond with the virtual scenery. In this way, real and virtual elements blend into a single comprehensive mise-en-scène.

XR Space

Different levels of immersion in virtual production have different names: an XR space is a partially encompassing wall, an XR stage adds a walk-on LED floor, and an XR volume almost entirely cocoons the production.

XR stage in diamond configuration (downstage direction marked by the red line)

In XR spaces and stages, props and performers need to be conventionally lit to achieve immersion.

A volume, on the other hand, can be considered to create its own light space, a condition in which a lighting rig is more auxiliary.

An example XR Volume with ceiling, floor and encompassing wall

Typically, XR stages are laid out as three-sided cubes (the diamond configuration) or as curved screens atop a walk-on LED floor. There are variations and hybrids of these concave spaces. All of them typically open so that the floor’s diagonal aligns with the main camera or viewing axis (see the diamond illustration above), analogous to the upstage-downstage direction. The arrangement encompasses screens, lights, cameras, performers and props.

In contrast to green-screen shooting and post-production, XR workflows can achieve true interaction of analogue and virtual elements in real time. Instead of compositing shots after the fact, all effects are presented to a single camera and video stream.

Green screen set

Whatever the type of XR set, it is part of the real-world production space. The virtual reality is a conceptual overlay on the real space. Their amalgamation is then made visible through the camera on set.

To plan the XR design, lighting and the in-camera result, it is useful to simulate the LED set, real props and performers in 3D. The real and virtual components of the set and scenery become one creative domain.

Previz of a virtual 3D set superimposed on the real set.

Merging it all

A second domain is the compositing stack that combines all layers into the intended overall shot. To further define this compositing domain, we will explore XR layering and the related technical aspects:

3D space from the camera’s point of view with a composite of all layers
disguise composite
transmission mix
frontplate: objects can be perceived anywhere within the depth of the composition and shot, here the middle ground
disguise UI

The bottom layer of the compositing stack is the “backplate”. The backplate is everything that is presented to the LED walls of the XR stage. In order to render the image onto the LEDs, the camera needs to be calibrated in spatial relation to the real set. At any time, the camera’s position, direction, focus and zoom are tracked and applied to a virtual camera. This virtual twin renders the virtual scene, comparable to the eyepoint of a player in a first-person game. The resulting image of the virtual world is passed to the LED walls.
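
As a rough sketch of that per-frame flow (written here in Python with purely illustrative names; no particular tracking protocol or render engine is implied):

    # Illustrative sketch only: tracked camera data drives a virtual twin.
    from dataclasses import dataclass

    @dataclass
    class CameraSample:
        position: tuple   # (x, y, z) in stage space, metres
        rotation: tuple   # (pan, tilt, roll), degrees
        focus: float      # focus distance, metres
        zoom: float       # focal length, millimetres

    @dataclass
    class VirtualCamera:
        position: tuple = (0.0, 0.0, 0.0)
        rotation: tuple = (0.0, 0.0, 0.0)
        focus: float = 3.0
        zoom: float = 35.0

        def apply(self, sample: CameraSample) -> None:
            # Mirror the tracked values so the rendered perspective
            # matches what the physical lens sees.
            self.position = sample.position
            self.rotation = sample.rotation
            self.focus = sample.focus
            self.zoom = sample.zoom

    # Per frame: read the latest tracking sample, update the twin,
    # then render the virtual scene and send it to the LED walls.
    sample = CameraSample((2.0, 1.6, -4.0), (12.0, -3.0, 0.0), 4.5, 50.0)
    virtual_cam = VirtualCamera()
    virtual_cam.apply(sample)
    # backplate = scene.render(virtual_cam)   # engine-specific, not shown here
    # led_walls.display(backplate)            # mapped onto the physical LED surfaces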

The XR stage can display only as much of the virtual camera view as the configuration of the LED walls allows (see image: pixels falling on and off the set, wide shot vs. close shot).

Since it is likely that a wide camera shot will pick up more of the virtual scene than is covered by the screens, the system can render a “set extension” layer. This is the part of the backplate that physically cannot be displayed on the LEDs. Typically the set extension layer is composited above the backplate and ideally contains a feather and choke matte to blend well with the lower layer.
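
As an illustration of the blend itself, here is a minimal numpy sketch (hypothetical resolution and matte region, not any specific media server’s implementation):

    # Illustrative sketch: blend the set extension over the lower layer
    # using a feathered matte with values from 0.0 to 1.0.
    import numpy as np

    h, w = 1080, 1920
    lower_layer = np.zeros((h, w, 3), dtype=np.float32)      # camera view of the LED content
    set_extension = np.ones((h, w, 3), dtype=np.float32)     # virtual scene beyond the LED bounds

    # Matte: 1.0 where the LEDs physically cannot show pixels, 0.0 on the walls,
    # with a soft feathered edge so the layers blend instead of hard-cutting.
    matte = np.zeros((h, w, 1), dtype=np.float32)
    matte[:, 1600:] = 1.0                                     # region off the edge of the walls
    feather = 64                                              # feather width in pixels
    ramp = np.linspace(0.0, 1.0, feather, dtype=np.float32)
    matte[:, 1600 - feather:1600, 0] = ramp                   # soft transition into the extension

    composite = set_extension * matte + lower_layer * (1.0 - matte)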

The real-world performers and props are picked up by the camera along with the backplate. The real-world scenery is therefore presented correctly as long as it is within the LED walls’ bounds. Once a scene element is outside these bounds while the camera has a wider view, it is overlaid by the set extension layer and thus occluded.

The backplate, camera pickup and set extension layers form the base of the XR composite.

The top XR layer in the stack is the frontplate. The frontplate uses the same camera information as the backplate. The cameras in virtual space are effectively alike, but differ in their depth rendering, or pick up virtual scenery by render tags. These tags can encode “behind” or “in front” information per virtual scene item, and can be dynamically changed or keyframed. Although it is a simplification, it helps to consider that the frontplate camera renders from the camera focus plane towards the lens.
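
A minimal sketch of the tagging idea (the item names and tag values are purely illustrative; real systems expose this through their own UI and timelines):

    # Illustrative sketch: split virtual scene items into backplate and
    # frontplate render passes by a per-item depth tag. Tags can be changed
    # or keyframed at runtime to move an object behind or in front of the
    # real performers in the perceived depth of the shot.
    scene_items = [
        {"name": "skybox",        "tag": "behind"},
        {"name": "set_dressing",  "tag": "behind"},
        {"name": "floating_logo", "tag": "in-front"},
        {"name": "particles",     "tag": "in-front"},
    ]

    backplate_pass = [item for item in scene_items if item["tag"] == "behind"]
    frontplate_pass = [item for item in scene_items if item["tag"] == "in-front"]

    # Both passes use the same tracked camera; the frontplate is composited
    # above the camera pickup of the real stage, the backplate below it.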

If a system is only overlaying a tracked virtual render on top of a full-frame camera capture, this is commonly called augmented reality (AR). A frontplate layer does, however, not necessarily need to sit in front of a presenter or prop in the perceived depth of a shot.

Additional layers can be brought into the stack as screens, textures, frames and similar elements within the virtual environment. These can be presentations, graphics and videos as well as live streams. Such layers can appear in the frontplate or backplate, on a real prop within the XR stage, or as a graphic overlay such as lower thirds, tickers and logos.
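
Read bottom to top, a simplified view of the resulting stack could be sketched like this (illustrative only, not a vendor-specific pipeline):

    # Simplified compositing stack, bottom to top. Illustrative only.
    stack = [
        "camera pickup",   # real camera image: performers, props and the backplate on the LED walls
        "set extension",   # virtual scene beyond the LED bounds, feathered over the pickup
        "frontplate",      # tracked virtual elements rendered with the same camera data
        "overlays",        # graphics such as lower thirds, tickers and logos
    ]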

composition stack
