Barriers to XR Adoption in Virtual Production — Part 1
And how to get past them — A guide for users
This series is an extension of an article by Nils Porrmann, Thomas Kother, and me, posted at dandelion + burdock
You can read the full article at the frame:work website
While live event production has largely been shuttered by the coronavirus pandemic, one opportunity to produce performance content has seen rapid growth over the last eight months: XR. The term ‘XR,’ short for miXed Reality, describes a host of technologies for deploying real-time generated images that are manipulated based on the position of the viewer, which in this case is the camera. Rendering from the camera’s POV makes these images appear to coexist naturally with the performers, allowing sets, lights, and even other performers to be placed virtually in the scene.
While these tools were in development before the pandemic for their ability to create natural lighting and reflections, they are now being rushed into use because they solve many of the Covid compliance issues that make current film and television production exceedingly complex. However, while XR has been seen as having the potential to save live events, TV, and film production, adoption has proven to be a challenge.
Of the factors that determine whether XR can be used successfully, budget and time are the primary limitations. But others, including skills acquisition, team structure, production planning, and communication, add further complexity. Some of these issues existed before XR Production and have only been exacerbated by the rush to adapt to this very different working style. In this article we will break down these issues and outline strategies that will prepare you for conversations with clients about using XR successfully.
Where we’ve come from
A bit of history for perspective
Terms like XR & Virtual Production have become industry buzzwords, which unfortunately means they are easily interchanged and misapplied. XR is a subset of Virtual Production technologies and applies specifically to combining AR or Scenic Extension with background replacement tools. Background replacement can be the result of live green screen compositing or real-time video output to LED walls, and it is the backbone of Virtual Production. When the background is rendered and delivered as the scene is shot, overlaid with foreground elements, and all imagery is driven by camera POV, you have XR.
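To make that layering concrete, here is a minimal, purely illustrative sketch of one XR frame in Python. Every name in it (CameraPose, render_background, and so on) is a placeholder invented for this article, not the API of disguise, Notch, or any other product; the point is only that the background, the scenic extension, and the AR layer are all driven by the same tracked camera pose.

    # Minimal conceptual sketch of XR layering. Every name here is an
    # illustrative placeholder, not the API of any real media server or engine.
    from dataclasses import dataclass

    @dataclass
    class CameraPose:
        position: tuple   # tracked camera position in studio space (x, y, z)
        rotation: tuple   # tracked camera rotation (pan, tilt, roll)
        lens_fov: float   # field of view reported by the lens encoder

    def render_background(pose: CameraPose) -> str:
        # Background rendered from the camera's POV and sent to the LED wall
        # (or keyed in over green screen).
        return f"background from POV at {pose.position}, FOV {pose.lens_fov}"

    def render_set_extension(pose: CameraPose) -> str:
        # Virtual scenery continuing past the physical edges of the screen.
        return "set extension beyond the LED wall"

    def render_ar_layer(pose: CameraPose) -> str:
        # Foreground AR objects composited over the camera feed.
        return "AR objects in front of the performers"

    def xr_frame(pose: CameraPose) -> list:
        # One XR frame: every layer is driven by the same tracked camera pose,
        # which is what keeps virtual and physical elements lined up.
        return [render_background(pose), render_set_extension(pose), render_ar_layer(pose)]

    print(xr_frame(CameraPose((0.0, 1.6, -4.0), (0.0, 0.0, 0.0), 35.0)))

If any one layer is rendered from a different pose than the others, the illusion breaks, which is why camera tracking sits at the center of every XR workflow.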
XR is the result of decades’ worth of technology coming together into a new production paradigm. AR (Augmented Reality) has been used in sports for over twenty years, going back to the introduction of the virtual first-down line overlay. 3D tools have been built into video playback and media servers to support the popularity of projection mapping. Tracking tools used to place an object, person, or camera in 3D space have been implemented and refined over the last twenty years. Who expected sports graphics, video mapped cathedrals, and moving lights following a stage performer to all meet as one production toolset?
It took the language and practice of real-time content generation, as well as the use of previsualization tools in film production, to bring all these pieces together.
In the live events community, nearly every media server has employed some kind of effects tool to augment video content live. Colorizing a video file or treating it with some distortion is a form of real-time content manipulation. Several media servers took the next step, generating content from a bit of code and user-defined parameters rather than starting with a video file. But it took Notch to rapidly advance the use of code-generated imagery in media playback for live events.
If you are familiar with Notch, you will quickly understand the value of a real-time visual system when it comes to responding to external information and controls. Sound, light levels, physical forms, and mocap data can all be employed to change the way an image is generated. It is a natural evolution of this toolset for users to experiment with feeding camera position data into the media server to drive the real-time production of content.
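As a rough sketch of that idea, assume a procedural generator that is re-evaluated every frame; any external signal, whether an audio level or a tracked camera position, can then reshape the output. The parameter names and signal sources below are hypothetical and are not Notch’s (or any media server’s) actual API.

    # Rough sketch of externally driven real-time content. The parameters and
    # signal sources are hypothetical, not Notch's (or any server's) actual API.
    import math

    def generated_frame(t: float, audio_level: float, camera_x: float) -> dict:
        # Procedural content defined by parameters instead of a video file:
        # audio level drives scale, camera position drives a parallax offset,
        # and time drives a slow colour cycle.
        return {
            "scale": 1.0 + 0.5 * audio_level,
            "offset_x": 0.1 * camera_x,
            "hue": (t * 10.0) % 360.0,
        }

    # Simulated inputs for a few frames; in production these would come from
    # an audio analyser and a camera tracking system.
    for frame in range(3):
        t = frame / 30.0
        params = generated_frame(t, audio_level=abs(math.sin(t * 6.0)), camera_x=frame * 0.2)
        print(frame, params)

Swap the simulated inputs for live tracking data and you have the core loop that XR builds on.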
The disguise platform has led the charge among a very short list of XR solutions, providing users with an ecosystem that blends spatially driven content mapping tools with camera tracking and real-time content generation. While there are a few virtual studio solutions for AR and green screen, the disguise xR solution provides an LED screen + AR + scenic extension toolset. This combines the benefits of shooting in an LED volume with the immersive feel of a live green screen replacement workflow. There is an evolving list of Virtual Production solutions and their capabilities at the end of this article.
Over in the film industry, VFX is a history book unto itself. I will call out that the advancement of previsualization, motion capture, and robotic camera control was essential to creating more realistic visual effects. What is critical is the collision of these previz tools with game engines like Unity and Unreal, along with the improvement of projection and LED screen quality. Armed with all these technological advancements, I assume someone in a dark office surrounded by computer monitors thought, “wouldn’t this shot be easier to capture in camera than replace in post?” Ah, I wish it were as easy as saying that, but we are getting there.
What we have now is the beginning of a spatially sensitive production workflow with high-quality background replacement and real-time content generation technology. My hope is that the pre-production shot planning and previz of film production find their way into the multi-camera live event workflow. This hybrid process is the full realization of XR Production.
We need time and understanding to build out this new approach. When a client calls asking for XR, they may mean any one of these tools, or all of them. Don’t assume they know exactly what they want or why they want it. You have to engage your client in a conversation about production goals to fully understand what they actually want and how to deliver it, no matter what they originally asked for.
Coming up in Part 2: we’ll talk about how to deliver what your client asked for when the best solution is XR.