Mixed Reality Studios — A primer for producers: Part 4

Nils Porrmann · Published in frame:work · Nov 9, 2020

In 2020, Mixed Reality has become a much-discussed topic in media production. This article is part of a series, taken from a larger release at dandelion + burdock and inspired by everybody’s efforts at frame:work.

Current Challenges to XR Adoption

When XR is under consideration for a production, one of the major challenges to its successful use is gaining familiarity with all of the components that make up an XR stage. Media servers have been used in live events for 20 years, while LED screens have come to dominate scenic design in the last 10 years. Camera tracking for AR overlays has been in use since the mid-1990s, debuting in the sports market.
However, mixing these tools creates an entirely new production paradigm, one that is still evolving for both cost and workflow. While developing the process for how we efficiently use these tools together, consider every XR project part of a grander experiment to change the future of live events and broadcast production. Don’t let familiarity with the individual components of XR lull a production team into a false sense of understanding of how a production should be run and what it will cost.

Most importantly, consider your XR team a strategic production collaborator. Any meeting you would typically invite your set or lighting designer to should now include your XR or Screens Producer. If you have not worked with a media operations team (your media server team) that includes a Screens Producer in the past, it’s possible the team you are considering for your XR needs is not as experienced as they suggest. A good XR team should include a producer, a programmer, a content coordinator/workflow specialist and likely two or three engineers. In most cases, this represents expertise from different companies that specialize in different aspects of the XR production pipeline, though teams are evolving to deliver complete coverage. At minimum, your point of contact for an XR project should sub-contract or point you to the desired partners.

Along with a larger XR Operations team, the time consideration is significant. Best practice currently suggests two days of calibration time per tracked camera. This time might be referred to as ‘dark time,’ but it is a bit more involved than the ‘dark time’ needed for projector alignment. Camera calibration requires a clear stage, free of any work or rehearsal activity, and it requires the camera operators and all utilities and engineers for those cameras to participate as well as the XR and LED Screen teams.

Suggestion for an XR organisational chart (highlights show XR implications)

Other time considerations should be spread out across the production schedule. The software that drives XR is constantly adapting and improving, resulting in the occasional bug or stoppage to resolve issues. Creating space in the schedule to ride out these events will protect the project and allow the teams involved to find solutions or workarounds that keep the production moving forward.

In the past, you may not even have known when there was an issue with media server playback. These teams are excellent at quickly adapting and finding solutions to the various problems that come up with these tools. The significant change with XR is that when there is an issue, there is no way to hide the problem anymore. Any fault will cause the entire production to pause while a solution is found. The camera team may be needed and the stage may need to be cleared to fix a calibration issue, so camera rehearsal cannot continue while these problems get sorted.

The XR process will continue to improve and these potential faults will decrease, but for the time being it’s best to prepare for what happens if XR is not working.

If part of your live event is XR and the rest uses the LED volume with traditional content playback, consider running XR on a system isolated from the rest of the show’s playback needs. Also consider how you might rework the XR creative for traditional playback, and have those files prepared. The XR team should work closely with the content production team, advising on strategies and creative choices that feature what works best in the current software version while hiding its flaws or issues.

As a final point of consideration and something unfamiliar to the common interaction with a screens operations team prior to XR, the XR team should work closely with the camera director, choreographers and stage managers on blocking. Good communication and shot planning between the XR team and the Director will simplify and inform many choices that otherwise might critically complicate an XR production. This is why an XR producer should be involved in production meetings as early as possible.

For example, the shooting venue is really no longer the physical space of the LED stage, but part of the 3D world supplied by the content creators. Pulling wide on a shot is now a budget determination of how much of that 3D world will get built, rather than part of the scenic, lighting, or audience that would fill in that negative space. And that’s for one tracked camera.

When two or more cameras are tracked, the production team must carefully plan camera shots. The LED wall can only display the correct camera background for the active camera at any given moment, so either shots must cut between cameras, or the cameras’ fields of view, and the related virtual camera frustums, must not overlap. For some, the barrier to successful XR is not cost or time, but resistance to a new standard of XR team participation in camera blocking. This collaboration is essential and will help optimize costs and greatly improve the on-site process.
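The overlap constraint above can be sketched geometrically. Below is a minimal, simplified check, assuming cameras face the LED wall head-on and the wall is modelled as a straight line; the function names (`wall_footprint`, `frustums_overlap`) and the example distances are illustrative, not part of any real tracking or media-server API. Real shot planning would account for pan/tilt, wall curvature and the inner-frustum margin the media server adds.

```python
import math

def wall_footprint(cam_x, cam_dist, hfov_deg):
    """Interval of the LED wall (modelled as the x-axis) covered by a
    camera positioned at lateral offset cam_x, at distance cam_dist
    from the wall, looking straight at it with the given horizontal
    field of view in degrees."""
    half = math.radians(hfov_deg) / 2
    spread = cam_dist * math.tan(half)
    return (cam_x - spread, cam_x + spread)

def frustums_overlap(cam_a, cam_b):
    """True if the two cameras' wall footprints intersect, i.e. the
    two shots cannot run simultaneously without one camera seeing
    the other camera's background."""
    a_lo, a_hi = wall_footprint(*cam_a)
    b_lo, b_hi = wall_footprint(*cam_b)
    return a_lo < b_hi and b_lo < a_hi

# Two cameras 8 m from the wall, each with a 40-degree horizontal FOV:
# each footprint spans about +/- 2.9 m, so 5 m of lateral separation
# still overlaps, while 6 m does not.
print(frustums_overlap((0.0, 8.0, 40.0), (5.0, 8.0, 40.0)))  # True
print(frustums_overlap((0.0, 8.0, 40.0), (6.0, 8.0, 40.0)))  # False
```

Even this toy model shows why blocking is a planning problem: moving a camera closer to the wall or narrowing its lens shrinks its footprint and frees up shots for the other tracked cameras.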

Without question, this is an exciting time to be working in XR. There are fantastic examples of successful use of these tools. However careful discussion, schedule and budget review make this process much smoother. The impacts to budget, schedule and team communication cannot be overstated. We are re-inventing the production process.

(In)visible Crowds

XR production tools and adoption are shaping up, yet the technology comes with a cost. The production community is currently focusing its attention on making effects in real time and conforming them to user-friendly formats. This leads to a strong auteurism in the medium. The inherent dependence on carefully planned points of view, and the inevitable limitations of the virtual space’s design and rendering, manifest this level of control. While the hardware and software tools strive to offer greater degrees of freedom in the creative process, the results are hard to distinguish from film and TV products.

What is the essential added value of real-time XR events? In production it’s obvious: less post-production, infinite virtual worlds, low set-building costs, less travel, creative freedom in the process… Yet the information conveyed will, without exception, end up in a recording for viewing on demand.

Audiences can participate in XR shows in real time, but do they gain a visual benefit assuming the output is a typical 2D canvas? The viewer is oblivious to the complex process. The live events industry should therefore discuss the just-in-time paradigm and how the new technologies will feature in productions once crowds come back to venues.

Right now spectators use personal devices or watch from within a private setting, sharing an experience with family or very few friends. This raises design considerations for a form of broadcasting into intimate, small and cosy spaces.

Once XR productions narrowcast into game and VR style formats, the audience bubble inevitably shrinks to the single viewer. Interactivity and freedom of movement increase, while the virtual space overwrites the real.

At this juncture, volumetric capture supports bringing individuals into shared virtual and XR spaces. The rules of presentation can be broken, but the common denominator is the designed stage.

Is there a space that could be defined as a hybrid XR stage for live event streaming?

A stage closer to the traditions of theatre and live shows: one where we build the environment not only in the virtual, but as a physical stage with room for practical lighting, props and atmospherics, which XR then extends. Not a fully immersive stage, but windows into the virtual. A stage that still uses camera tracking to align the virtual with the physical, extending the set and the streamed volumetric performance. We are used to creating looks that merely lead the audience to believe that something is there, so why should the streaming format change that?
