The Indie Virtual Production Revolution with Unreal Engine

How Unreal Engine can help small teams and indie movie makers to boost their workflow and create realistic VFX shots in real-time with virtual production.

Alessio Regalbuto
XRLO — eXtended Reality Lowdown
13 min read · Feb 18, 2022


Cory Strassburger in an extract from his “My Alter Ego unleashed into the Metaverse”, Ep. 1

In the last five years, an astonishing evolution of the virtual production industry has taken place all over the world, driven by more accessible technologies for indie producers, the need to comply with Covid rules while filming new movies, and the advances in real-time VFX pipelines in Unreal Engine.

In this article, I want to share my personal experience of working in this field, provide an overview of this flourishing industry, discuss the revolutionary tools that everyone can now use to produce their own movies, and highlight the challenges behind the scenes.

What is Virtual Production?

Nowadays, virtual production covers a very broad range of definitions due to its rapid evolution. In an attempt to capture them all, we can define it as an innovative filmmaking technique that seamlessly combines physical and digital elements, using real-time software and 3D tracking devices specifically designed for this purpose.

Extract from Unreal Build: Virtual Production 2020 Sizzle | Unreal Engine

Among all its applications, placing real actors in 3D virtual environments and bringing virtual actors to life are the most common use cases for which movie companies and live broadcasting channels employ virtual production.

Example of placing real actors in 3D virtual environments. Extract from UNREAL ENGINE 5 — VIRTUAL PRODUCTION SHOWREEL 2021 (JARO ATRY)
Example of bringing virtual actors to life. Extract from Crafting the Ultimate Digital Human for Virtual Production | Unreal Engine

To the joy of many producers, these and similar use cases can now be implemented essentially for free using Unreal Engine, its plugins, and virtual sets. Especially for indie producers, this represents a real revolution and an opportunity to step into this market even with very limited resources.

The use of Unreal Engine for Virtual Production

In 2018, I started working with professional virtual production systems, and I was particularly surprised to find out that two of the main software packages used to create real-time VFX were Unity (which was used while shooting The Lion King) and Unreal Engine.

Back then, only a few tools were available within these engines to make virtual production simple enough for general use. My team and I therefore had to deal manually with a lot of issues, such as synchronization, misdetected video formats causing dropped frames and visual artifacts, and wrong colours resulting from incompatible render passes. Solving all these challenges and creating custom automated configurations required a lot of coding.

After all, both Unity and Unreal were initially designed for game development, and most of their rendering pipelines and mechanics did not fully support the features required for a professional virtual production workflow.

Credit: Disney. Extract from this video. More information in this article by Magnopus

However, Unreal Engine has evolved drastically in recent years, and the majority of those troubles have been solved by the latest tools offered by Epic Games (such as OpenColorIO, nDisplay, and Composure). These have remarkably boosted many movie makers and broadcast producers, who are now free to use Unreal for their virtual production projects, availing themselves of much simpler systems and comprehensive workflows that helped deliver content such as The Mandalorian and the Fortnite World Cup live show in very short times. This trend is expected to continue in the future, according to financial forecasts and market analyses.

Extract from Virtual Production Sizzle Reel 2022 | Unreal Engine
Extract from Marshmello Fortnite World Cup 2019 (LIVE) Concert

Why an Indie Virtual Production Revolution?

People might think that only expensive LED walls or movie sets allow producers to create their movies and real-time VFX, but the reality is that a simple green screen, some affordable tracking devices, and a camera can be enough for a streamer or a hobbyist to start experimenting successfully with virtual production.

This was made possible by Unreal’s new virtual production pipeline, which significantly reduces production times and costs compared to traditional approaches, and makes room for new customized, simplified solutions that everyone can now benefit from.

The virtual production pipeline & Unreal Engine

Diagram from https://www.unrealengine.com/en-US/virtual-production

Traditionally, filmmakers were forced to use a sequential production pipeline that consumed a lot of effort, time, and especially money. It typically involved three steps:

  1. Shoot a video of the performance with the real actors;
  2. Apply to the recorded footage some Previz* manually designed by artists, serving as a guideline for the post-production team to apply the visual effects afterwards, following the producer’s requirements;
  3. Add the VFX at final-pixel quality in post-production, replacing the Previz with higher-quality models and animations.

*Previz are visual aids that give the producer an idea of how the final shot will look once the visual effects are ultimately applied to the original footage. The name refers to the “pre-visualization” of a draft version of the VFX superimposed on the real performance footage.

From this perspective, a single mistake anywhere along the pipeline could mean unrecoverable losses of hundreds of thousands of dollars, with errors that were very difficult to fix without incurring huge additional costs and production delays.

Since Epic Games stepped into virtual production, a more flexible and iterative approach has been adopted, in which a director can quickly change a final edit multiple times, and the live performance of the actors can be visualized simultaneously with the visual effects in real time, saving a considerable amount of money and time. As a result, indie producers can enter the market and produce their shots faster and cheaper than ever before.

More affordable camera and motion tracking solutions

In conjunction with the software advancements brought by Unreal Engine, plenty of new hardware has recently entered the market, providing motion capture and camera tracking capabilities at very reasonable costs.

Mocap Suits

Extract from Unreal Engine Mograph.com Winbush Student Showcase Reel

A couple of years ago, I used an Xsens mocap suit for some virtual production testing. Despite the fact that it is not particularly accurate in terms of room-scale body tracking (the character slowly drifted away due to small inaccuracies accumulating in the body sensors during use), I found it very easy to set up and quite responsive to arm, leg, head, and chest movements. As a starting point, it could be a valid investment for an indie studio that needs full-body tracking.

More professional but expensive suits use reflective markers to achieve more accurate room-scale body tracking, but they require additional special cameras to track those markers, which for small indie producers might be overkill. Among them, the suits from OptiTrack are a good example.

Camera Tracking and Face Tracking Solutions

In the past, I have worked with Ncam’s camera tracking systems, which also come with a free plugin on the Unreal Marketplace to integrate the live tracking with custom rendering pipelines in the engine. With their camera bar, it is possible to accurately track real camera movements both indoors and outdoors without markers, which makes it a good deal for simple setups and small studios.

Another system that I have seen in showcases is Mo-Sys, which uses reflectors and mainly targets indoor spaces. Surprisingly though, in this context even just an iPhone can be a valid alternative for indie producers these days. Thanks to the Live Link Face app and the Unreal Remote 2 app on the App Store, it is possible to move the virtual camera in Unreal simply by tilting and translating the phone, or to live-capture a performer’s face to drive that of a virtual character. This is possibly the cheapest and most convenient solution for newcomers to the field, and I recommend it for less demanding shooting requirements, especially when live distortion correction and lens tracking are not critical for the shots.
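Data streamed from apps like Live Link Face shows up in the engine as Live Link “subjects” (more on Live Link later in this article). If you want to double-check from code that the phone is actually streaming, here is a minimal C++ sketch; it assumes the UE4.27-era Live Link client API, so verify headers and signatures against your engine version:

    #include "Features/IModularFeatures.h"
    #include "ILiveLinkClient.h"
    #include "LiveLinkTypes.h"

    // Log every subject (face, body, camera, ...) currently streaming into Live Link.
    void ListLiveLinkSubjects()
    {
        IModularFeatures& Features = IModularFeatures::Get();
        if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
        {
            return; // Live Link plugin not enabled
        }

        ILiveLinkClient& Client =
            Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

        // Include disabled and virtual subjects so nothing is missed while debugging
        for (const FLiveLinkSubjectKey& Key : Client.GetSubjects(true, true))
        {
            UE_LOG(LogTemp, Log, TEXT("Live Link subject: %s"),
                *Key.SubjectName.Name.ToString());
        }
    }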

Extract from Daz to Unreal Engine ~ Testing my iPhone X as a Virtual Camera with Unreal Remote 2
Extract from Live Link Face Tutorial with New Metahumans in Unreal Engine 4

It is also good to know that a few custom plugins for Android exist too, but they are not yet as well integrated into the engine as the iOS ones.

Homemade tracking solutions

Extract from Camera Stage Controller | UE4 Virtual Production

It might not be obvious, but nowadays many custom-made solutions exist for home-based virtual production sets as well! Some of them use traditional consumer devices like Vive trackers, Kinect, and Oculus trackers. Others get even more creative by using gaming steering wheels and pedal controllers to drive props and the virtual camera in Unreal Engine. These are a remarkable alternative to professional solutions and are slowly becoming a trend among many indie producers and YouTubers. A comprehensive overview of some of these homemade setups was given during Unreal Fest Online 2020.
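As a rough idea of how such a homemade rig can be wired up in the engine, the sketch below parents a CineCamera to a MotionControllerComponent fed by a Vive tracker. The “Special_1” motion source name is an assumption (the source your tracker reports depends on your SteamVR tracker-role bindings), so treat this as a starting point rather than a recipe:

    #include "CineCameraActor.h"
    #include "MotionControllerComponent.h"

    // A minimal sketch: drive a CineCamera from a Vive tracker by parenting the
    // camera actor to a MotionControllerComponent created on a simple "rig" actor.
    void RigCameraToViveTracker(AActor* RigActor, ACineCameraActor* Camera)
    {
        UMotionControllerComponent* Tracker =
            NewObject<UMotionControllerComponent>(RigActor, TEXT("CameraTracker"));
        Tracker->SetupAttachment(RigActor->GetRootComponent());
        // "Special_1" is a guess; check your SteamVR role bindings for the real name
        Tracker->MotionSource = FName(TEXT("Special_1"));
        Tracker->RegisterComponent();

        // The camera now follows the tracked pose every frame
        Camera->AttachToComponent(Tracker,
            FAttachmentTransformRules::SnapToTargetNotIncludingScale);
    }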

For convenience you can check the following YouTube videos and tutorials for more examples of homemade tracking solutions and configurations:

  • Creating a Virtual Camera using HTC Vive in Unreal Engine (4.26) (Watch)
  • iPhone Facial Capture with Unreal Engine | Unreal Fest Online 2020 (Watch)
  • New Virtual Camera 2.0 Setup in Unreal Engine using your iOS Device (Watch)
  • How to Setup an iPad as a Virtual Camera using the Live Link VCAM app (Watch)
  • Virtual Production in Unreal: Zoom, Focus, and Camera Tracking using 3 Vive Trackers (Watch)
  • VR Mocap for Unreal Engine — Quick Start Video (Watch)
  • Virtual Camera Overview (Watch)
  • Virtual Production with a Projector & Unreal Engine (Watch)
  • Camera Stage Controller | UE4 Virtual Production (Watch)
  • My Virtual Production Studio At Home (Watch)

Which tools are recommended for Indie Virtual Production in Unreal Engine?

To help indie producers start their journey in this fascinating industry, I have listed below some of the most important tools for producing virtual production VFX shots in Unreal Engine.

Live motion capture with Live Link

Once you have a motion capture suit or a tracking system, you will use Live Link to let Unreal decode the position of your face, body, or camera in real time, automatically handling the synchronization of your movements with the engine’s virtual character or camera. More info on how to configure Live Link and understand its dynamics can be found on the Unreal Engine documentation website and in this quick YouTube tutorial.
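In practice, you rarely need code for this: adding a Live Link Controller component to your camera or character and picking the subject in the Details panel does the binding. For the curious, a minimal C++ sketch of the same binding might look like the following; the ULiveLinkComponentController and ULiveLinkTransformRole class names follow recent UE4 releases and should be treated as assumptions to verify against your engine version:

    #include "CineCameraActor.h"
    #include "LiveLinkComponentController.h"
    #include "Roles/LiveLinkTransformRole.h"

    // Bind a CineCamera to a Live Link transform subject (e.g. a phone streaming
    // its pose), so the virtual camera follows the physical device.
    void BindCameraToLiveLinkSubject(ACineCameraActor* Camera, FName SubjectName)
    {
        ULiveLinkComponentController* Controller =
            NewObject<ULiveLinkComponentController>(Camera, TEXT("LiveLinkController"));
        Controller->RegisterComponent();

        // Tell the controller which streamed subject to evaluate, and as what role
        Controller->SubjectRepresentation.Subject = SubjectName;
        Controller->SubjectRepresentation.Role = ULiveLinkTransformRole::StaticClass();
    }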

Extract from Unreal Engine Metahuman Live Face App Tutorial

One of the most appealing features is combining Live Link with MetaHuman characters to easily create and control virtual humans in real time. For convenience, here are some tutorials from YouTube:

  • Unreal Engine Metahuman Live Face App Tutorial (Watch)
  • Calibrate Metahuman and Live Link Face the right way (Watch)
  • Unreal Engine Metahuman Face and Body Motion Capture Tutorial (Watch)

Video Capture your live performance with the Media Framework

Everything you need to know about capturing your live video and streaming it into Unreal Engine is covered by the Media Framework, a set of assets and tools within the engine that supports a wide range of formats and video sources. From a simple webcam stream, to a connected DSLR, to more professional cinema cameras (which are usually more expensive and output over SDI, requiring AJA or Blackmagic capture cards to bring the signal into the engine), you can define different video streams to be processed accordingly in real time.
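Most of this is configured through Media Source and Media Player assets in the editor, but the same can be driven from C++. Below is a hedged sketch that opens the first webcam the operating system reports into an existing UMediaPlayer asset; it mirrors the “Enumerate Video Capture Devices” Blueprint node, and the exact struct fields may differ slightly between engine versions:

    #include "MediaBlueprintFunctionLibrary.h"
    #include "MediaPlayer.h"

    // Open the first webcam the OS reports into an existing UMediaPlayer asset.
    bool OpenFirstWebcam(UMediaPlayer* MediaPlayer)
    {
        if (!MediaPlayer)
        {
            return false;
        }

        TArray<FMediaCaptureDevice> Devices;
        // -1 means "no filter": list every video capture device
        UMediaBlueprintFunctionLibrary::EnumerateVideoCaptureDevices(Devices, -1);

        if (Devices.Num() == 0)
        {
            return false; // no capture hardware found
        }

        // The device URL (e.g. "vidcap://...") tells the player which source to open
        return MediaPlayer->OpenUrl(Devices[0].Url);
    }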

Extract from Virtual Production in Unreal Engine 4.20

Some useful tutorials can be found on JSFilmz’s and Cinematography Database’s channels on YouTube.

Compositing with the new Composure Pipelines

Once you have a moving character and your live video, it is time to compose the virtual animation with the real footage. This can be done with Composure, which allows you to blend virtual and real footage into a final shot in real time.

The workflow is similar to the one professional compositors use in Nuke to combine VFX and pre-recorded videos, with the difference that in Unreal this can be easily done in real time and with a few commands.
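To build some intuition about what these tools do under the hood: every layer blend in a compositor, whether in Composure or Nuke, ultimately comes down to a per-pixel operation. The classic premultiplied-alpha “over” operator is sketched below as a conceptual illustration (this is not Composure’s actual source code):

    // Premultiplied-alpha "over" operator: the core math behind layering a keyed
    // foreground (e.g. a green-screen actor) on top of a rendered background.
    struct FLinearPixel
    {
        float R, G, B, A;
    };

    FLinearPixel CompositeOver(const FLinearPixel& Fg, const FLinearPixel& Bg)
    {
        // out = fg + bg * (1 - fg.alpha), applied channel by channel
        const float InvA = 1.0f - Fg.A;
        return FLinearPixel{
            Fg.R + Bg.R * InvA,
            Fg.G + Bg.G * InvA,
            Fg.B + Bg.B * InvA,
            Fg.A + Bg.A * InvA
        };
    }

Composure runs equivalent operations on the GPU every frame through its transform and output passes, so you never write this loop yourself; it is shown only to demystify what “compositing” means.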

Additionally, if you are already familiar with Nuke workflows, you can stream all the real-time rendered sources directly from Unreal and compose them in the Nuke graph using the UnrealReader node from the Nuke Server plugin. For a quick reference, you can check this recent tutorial from Foundry and some additional resources. Note, however, that a Nuke license could require a considerable financial investment; if you are starting your journey with a limited budget, you might want to use Composure instead, optionally combined with a more affordable third-party tool like DaVinci Resolve for final post-production editing.

Extract from Unreal Engine Virtual Production Composure Output to Viewport

A good resource to learn Composure is one of the Unreal Online Learning courses, Real-Time Compositing Basics. There are also some useful YouTube videos for beginners:

  • Using Composure with a Backplate and HDRI (Watch)
  • Getting Started with Composure | Live Training | Unreal Engine (Watch)
  • Unreal Engine 4 Green Screen Tutorial (Watch)
  • Indie Virtual Production is here! (Watch)
  • Unreal Engine Virtual Production Composure Output to Viewport (Watch)

Prepare your virtual sets with VR Scouting

To considerably improve the perception of space and size in your virtual set, you can use virtual reality within the engine and visualize your entire scene in 3D from your headset. It is very difficult for filmmakers to design a shoot while looking at a 2D window into a 3D world. Putting on a VR headset unlocks the extremely powerful spatialization abilities of the brain to understand how pieces of the environment fit together, and it allows filmmakers to design a shoot using the storytelling instincts they have trained throughout their professional lives. With VR scouting it is possible to easily change the lighting and camera setups, add markers, and measure your props using your hands and a simple UI. Especially for VR enthusiasts, this represents a much better alternative to 2D displays when designing the space of your virtual scene.

Extract from Real-Time In-Camera VFX for Next-Gen Filmmaking | Project Spotlight | Unreal Engine

Let your crew work remotely with VR & Unreal Multi-User Editing

If you have a small team for your indie project, Multi-User Editing will let you meet them remotely, directly inside your virtual scene in Unreal Engine, using either PCs or VR headsets. This is extremely useful for working with team members across the globe, breaking the boundaries of the physical world. Just as on a real set, this tool helps the director better control the scene setup while shooting in real time, and helps the actors improve their performances by being immersed in the virtual stage.

Extract from Explore Collaboration with Unreal Engine’s Multi-User Editor | Webinar

For a better understanding of how the Multi-User Editing tool works, I advise you to watch a webinar from Epic Games released on their YouTube channel and a live presentation from SIGGRAPH:

  • Explore Collaboration with Unreal Engine’s Multi-User Editor | Webinar (Watch)
  • Generations — Siggraph2018: Real-Time Live! (Watch)

More information on the related SIGGRAPH 2018 presentation can be found here.

What’s new in 2022 for indie producers

The Virtual Production Week 2022 by Epic Games

Recently, Epic Games organized Virtual Production Week 2022, a comprehensive event gathering talks and showcases about the opportunities that Unreal Engine has created for the movie industry, and especially for indie producers, to realize impressive virtual production projects even with small teams.

Among the different sessions, one of the most relevant to the topic of this article focuses on how small teams are empowered by the engine’s tools to produce impressive shots and streams.

Believable virtual actors and performances

Epic Games is working on a powerful, easy-to-use tool, still in early access, that helps creators produce believable virtual characters for their cinematic shots and video games.

MetaHuman Creator and The Matrix Awakens experience

Extract from Cinematic Lighting for MetaHumans | Inside Unreal

As an unexpected surprise at the end of 2021, Epic Games released an unbelievable new experience made with Unreal Engine and MetaHumans: “The Matrix Awakens”. Even though it was only released for consoles and is not currently available in the Editor, this project demonstrated how powerful the engine has become in terms of real-time procedural rendering.

Creators can now experiment with the MetaHuman Creator early access directly from a web browser to produce their virtual humans and customize them from a list of templates. These are then cloud-saved to the Epic account used to log in to the tool, and can be imported into any scene through the Quixel Bridge plugin.

This free tool is definitely worth checking out, especially if your team is small and might not have enough artists to model realistic humans in 3D from scratch.

The era of Virtual YouTubers

People like Cory Strassburger have managed to create amazing content for YouTube, showcasing how simple it is nowadays to create virtual characters and animate them through innovative virtual production pipelines in Unreal. Specifically, Cory used the iPhone face capture app combined with an Xsens body capture suit to bring to life characters like Xanadu, probably his most popular one so far.

In the following videos, he shows how he managed to create his characters from his homemade virtual production room. Check them out!

Where to find more learning resources?

To help you learn more about virtual production and its workflow in Unreal Engine, I wanted to write down a list of useful learning materials available online for beginners and more experienced indie creators:

YouTube channels: the channels mentioned throughout this article, such as JSFilmz’s and Cinematography Database’s, are a great starting point.

More information on the roadmap planned by Epic Games and Unreal Engine can be found in one of their latest webinars.

I hope you will be successful in your journey and that my contribution will speed up your learning!
