Using Virtual Production methods on small-scale projects

Lewis Jelley · Published in Storm & Shelter
Feb 10, 2022 · 9 min read

For all the deep dives I have written (and will write), if you want to grow as a DOP my biggest piece of advice is ‘Say yes, and figure it out later’.

A lot of what I love about what I do comes from committing to an idea and then grappling with how to creatively and technically achieve it. Necessity breeds invention, and there honestly is nothing like a looming client deadline to kick your brain in gear.

Most recently, this happened on our project with Clogau. Our friends over at S3 (RIP) had pitched a creative that would require us to capture a collection of small, delicate jewellery set within the Welsh landscapes that inspired the collections.

For anyone who has tried to capture detailed product shots in the wild, you'll know we were facing a pretty massive task; the variables in modelling the natural light alone are enormous. Given the timescales and no prior recce, there had to be a better route, and I was determined to find it.

And then it hit me… why does it need to be shot on location? Couldn’t we take a leaf out of Hollywood’s book and create a virtual environment to give us more control? Granted, they have a bigger studio and a bigger budget, but there must be a way of scaling down the setup to achieve a similar result.

And we did — here’s how.

The idea

While shooting the scenic sequences on location, we would also shoot some background plates that we could take into a studio and play back on a massive TV to act as a virtual environment.

For each of the four videos we'd be making, the format would be the same: a series of real-world sequences of the environments that inspired the collection, followed by a product lockup shot believable enough to pass as having been filmed during the same shoot.

The top-level idea I had was this:

  • Shoot some background plates on location
  • Create a matching tabletop design in a studio
  • Play the captured footage in the background
  • Film the product lockup
  • Trick everyone

Simple, right? Well yeah, in principle it is.

The challenge

Making it look real

Making things look real when they're not is hard. Getting 95% of the way there is pretty simple with the right tools, but it's those final details that make the difference between something looking seamless and the shot taking a one-way trip to the uncanny valley. A lot of this has to do with lighting.

Making realistic sunlight on a big set usually requires even bigger lights and way more space as you’re trying to create parallel light beams at scale. High contrast and collimated beams — dems da rules. Trust me, I’ve tried (and failed) to break them.

Even on large-scale virtual productions like The Mandalorian, where huge LED volumes produce real-time renders of building-height worlds, you'll sometimes see huge banks of powerful lights on cranes above the virtual backgrounds to provide the contrast ratios and directionality needed to fool the viewer.

Fortunately, physics works both ways. What becomes more difficult as you increase the scale conversely becomes easier as you decrease it. At least that was the philosophy of this project.

How we did it

Please note that I am writing this article after we have completed the project, so I’ve put it into a nice story order.

In reality, I find that my ideas and processes behave more like matter circling a black hole; individual fragments that slowly begin to interact before being pulled together into a finished method… it does work, though.

A glimpse into my process

Camera and Lenses

Going off the pitch deck, before I’d even gone down a tech rabbit hole, my instincts were telling me that to pull this off we needed to use anamorphic lenses.

The way they squeeze a wider field of view into a regular 35mm frame causes all sorts of peculiar effects, such as horizontal flares, oval bokeh and barrel distortion. That distortion would be integral to helping us blend in the virtual background, as it occurs no matter what's in front of the lens — virtual or not.

It also helps that anamorphic lenses are having their moment with high-end aesthetics, so no one would look at the end film and wonder why we used that style. The only problem was I wasn’t sure we’d be able to afford a set. Or even a camera that could handle it without compromising the resolution.

Thankfully, Cinewest had a set of very affordable Atlas Orion lenses available, and with help from Alister Chapman's videos I decided on the Sony FX9. It doesn't natively support anamorphic lenses, but it does have an E-mount and a 6K sensor that produces a 4K image. As long as you use an external monitor to de-squeeze while viewing, you can de-squeeze and resize the footage in post to end up with a 3.2K image.
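If you want to sanity-check the de-squeeze maths yourself, it's simple enough to sketch out. The numbers below are illustrative (a 2x squeeze, which is what the Orions are, on a 4K DCI recording), not a record of our exact pipeline:

```python
def desqueeze(width: int, height: int, squeeze: float):
    """Two equivalent ways to restore geometry from an anamorphic capture."""
    stretched = (round(width * squeeze), height)  # stretch wide: upscales width
    squashed = (width, round(height / squeeze))   # squash tall: no invented pixels
    return stretched, squashed

# 2x anamorphic on a 4096x2160 recording (illustrative numbers)
wide, tall = desqueeze(4096, 2160, 2.0)
print(wide)  # (8192, 2160) -- a ~3.79:1 frame you then crop/resize for delivery
print(tall)  # (4096, 1080) -- same aspect ratio without upscaling
```

The final deliverable size then depends on the squeeze factor and the crop you choose for your delivery aspect ratio.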

To avoid a very distorted doubling-up of anamorphic effects when I eventually shot the TV screens in the studio, we needed to shoot the background plates on spherical lenses, which we already had on the shelf.

The shoot: On location

For this whole idea to work, it really came down to what we shot on location.

Though my tactic of using background plates would work in theory, unlike 'true' virtual environments we wouldn't be able to switch things up at the press of a button; we'd be confined to the footage we got. So it had to be good… and usable.

The contrast ratio of the TV we would be using was pretty decent, but it wasn't nearly good enough to emulate the contrast of looking directly at the sun or a bright light source. So, when it came to shooting, we factored that in, made sure we weren't fighting strong backlight or direct sun, and positioned the main light source out of frame.

Not only would this make the virtual environment more manageable, but the side-light would give the shots that extra cinematic feel that we were going for.
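To put rough numbers on that contrast limitation (illustrative figures, not measurements from our setup): a contrast ratio converts to stops of dynamic range as log2 of the ratio, and the gap between a good TV and a sunlit exterior is huge.

```python
import math

# Hypothetical figures: a decent LCD panel vs direct sun against deep shadow
tv_contrast = 5_000
sunlit_scene = 100_000

def stops(ratio: float) -> float:
    """Contrast ratio expressed in stops (doublings) of dynamic range."""
    return math.log2(ratio)

print(f"TV: {stops(tv_contrast):.1f} stops")             # ~12.3 stops
print(f"Sunlit scene: {stops(sunlit_scene):.1f} stops")  # ~16.6 stops
```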

Shooting the plates

When shooting the plates, I basically imagined I was shooting the product on location, just without the product there. I looked for frames and angles with points of interest and natural leading lines that could give us an intuitive guide to work with back in the studio.

As we were going for the most natural look we could, I opted for a 35mm focal length to get the most width with the least distortion, with the aperture at a deep stop and focus set to capture as much of the background as sharply as possible.
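If you want to be precise about 'as much of the background in focus as possible', the hyperfocal distance is the standard tool: focus there, and everything from half that distance to infinity stays acceptably sharp. A quick sketch (the f-stop and circle of confusion below are assumptions, not our exact settings):

```python
def hyperfocal_m(focal_mm: float, f_number: float, coc_mm: float = 0.030) -> float:
    """Hyperfocal distance in metres: H = f^2 / (N * c) + f."""
    return (focal_mm ** 2 / (f_number * coc_mm) + focal_mm) / 1000

# 35mm lens at a hypothetical f/8; 0.030mm is a common full-frame CoC value
h = hyperfocal_m(35, 8)
print(f"Hyperfocal ~= {h:.1f}m; sharp from ~= {h / 2:.1f}m to infinity")  # ~5.1m / ~2.6m
```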

To make things easier, we also pulled out one of the oldest tricks in the book and used VFX mirror spheres. Capturing shots of them on location later allowed me and my gaffer Matt to better recreate the environment in the studio, as we could deconstruct where the main light sources were coming from, their colour and the levels of ambient light. I had never used one before, and although it felt a little old school, it was still very effective. If it ain't broke and all that.

Yeah, those mirror spheres
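For the curious, reading a light direction off a mirror sphere is just the reflection law: the sphere's surface normal at a highlight bisects the camera ray and the light ray. A minimal sketch, with made-up highlight coordinates and an orthographic-view approximation:

```python
import numpy as np

def light_direction(u: float, v: float) -> np.ndarray:
    """Light direction from a highlight at (u, v) on a mirror-sphere photo.

    (u, v) are normalised so the sphere fills [-1, 1] x [-1, 1], and we
    assume the camera looks straight down -z (a common approximation).
    """
    # Recover the sphere's surface normal at the highlight from x and y
    n = np.array([u, v, np.sqrt(max(0.0, 1.0 - u * u - v * v))])
    view = np.array([0.0, 0.0, 1.0])  # direction from the surface back to camera
    # Reflect the view direction about the normal: L = 2(N.V)N - V
    return 2.0 * n.dot(view) * n - view

# Hypothetical highlight up and to the right of the sphere's centre
print(light_direction(0.3, 0.4))  # unit vector pointing towards the light
```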

Also, it goes without saying, but when you’re doing this kind of thing, you need to keep the plates as still as possible. You’d think this would be easy, but I had an uncanny way of making it difficult for myself, especially when Josh had to save me from being washed downstream (don’t ask).

The shoot: In the studio

We'd captured the plates and shot the rest of the footage; now it was time to see if my vision would work.

Virtual environment vs Green Screen

One of the key benefits of using LED volumes (or a really big TV) as a background is that it also becomes a convincing light source — something you simply can't get from a green screen. It also gives you reflections, which, when you're shooting jewellery, are very important and pretty unavoidable.

Aside from the technicalities of the screen itself, the TV was a pretty intuitive solution. Having shot in 4K, we had the flexibility to move the background around and rescale it to fit without losing resolution. Once we'd set the screen to 50Hz and limited the TV's gamma/gamut to REC709, we were able to eliminate any flicker and head off any possible gamut issues.
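The 50Hz choice matters because flicker disappears when each camera exposure spans a whole number of the display's refresh cycles. A quick way to check a shutter setting (assuming a 25fps shoot against a 50Hz panel, the natural pairing in the UK):

```python
def flicker_safe(shutter_deg: float, fps: float, refresh_hz: float) -> bool:
    """True if the exposure covers a whole number of display refresh cycles."""
    exposure_s = (shutter_deg / 360.0) / fps
    cycles = exposure_s * refresh_hz
    return abs(cycles - round(cycles)) < 1e-6

print(flicker_safe(180, 25, 50))  # True: 1/50s exposure = one full 50Hz cycle
print(flicker_safe(180, 24, 50))  # False: 1/48s against 50Hz will beat/flicker
```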

Set design

As much as this article mostly focuses on the technicalities of digital production methods, the entire idea couldn’t have been executed without some brilliant practical set design and art direction.

Our Art Director Nikita did a truly fantastic job of taking our reference imagery and location footage and recreating realistic table-top foregrounds on which to place the jewellery. To help sell it, we made sure the foreground was always cutting through the background; anywhere it didn't would've caused double-focus issues and probably given the game away.

Thank you, Nikita!

Lighting

Thanks to the background plates, our written notes and the VFX spheres, we'd already done most of the work of figuring out the motivation for the lighting. We just needed to light the damn thing.

There would be a whole heap of conditions, colours and even a fire pit we needed to emulate, so Matt and I very quickly settled on an LED solution, a mixture of hard and soft sources, all of which could be tuned to any colour, and a series of modifiers (including a fake tree) to shape the light and enhance the realism of the scene.

For a more specific breakdown:

  • Arri Orbiter (Our hard light and our sun)
  • Kino Flo Select 30 (Ambient level)
  • Astera Titan Tubes (Additional wrap and lighting gags)

Colour management on-set

To capture the maximum possible dynamic range and colour information for the plates, we shot in Slog3 gamma/gamut, but once the plates were displayed on the TV screen, for all intents and purposes it wasn't a screen anymore; it was real.

And the real world (as much as Cardiff might lead you to believe otherwise) isn’t as grey, desaturated and flat as Slog3.

Using DaVinci Resolve's Colour Space Transform tool, we matched the plate footage to the TV's REC709 colourspace, or at least as closely as we could. The screen was built for watching The Mandalorian rather than being part of the set, so there was always going to be a little bit of inaccuracy… which we were prepared to fix.
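For anyone curious what a transform like that actually does under the hood, here's a stripped-down sketch of the transfer-function half of it, using Sony's published S-Log3 curve and the standard Rec.709 OETF. Resolve's Colour Space Transform also converts the gamut with a matrix (e.g. S-Gamut3.Cine to REC709), which I've left out to keep this short:

```python
def slog3_to_linear(v: float) -> float:
    """Sony's published S-Log3 curve: encoded [0..1] -> scene linear (0.18 = mid grey)."""
    if v >= 171.2102946929 / 1023:
        return (10 ** ((v * 1023 - 420) / 261.5)) * (0.18 + 0.01) - 0.01
    return (v * 1023 - 95) * 0.01125 / (171.2102946929 - 95)

def rec709_oetf(lin: float) -> float:
    """BT.709 OETF, with the input clipped to the display's 0..1 range."""
    lin = min(max(lin, 0.0), 1.0)
    return 4.5 * lin if lin < 0.018 else 1.099 * lin ** 0.45 - 0.099

# Mid grey: S-Log3 places 18% reflectance around code value 420/1023 (~0.41)
mid = slog3_to_linear(420 / 1023)
print(round(mid, 3), round(rec709_oetf(mid), 3))  # 0.18 -> ~0.409
```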

Using DaVinci Resolve, I made some live adjustments to the contrast, saturation and brightness of the backgrounds to help them blend in with the real foregrounds. I steered away from anything other than global adjustments, as I didn't want to strain the footage or further limit the spectrum of colours captured.

Besides, this was something that could be further pushed in the grade.

Conclusion

The biggest sign of success from this project is simply that it worked.

Not only did the end video look realistic, but I had a lot of people asking me how we managed to make our product shots look so great on location (I did, of course, tell them the truth).

It was a surreal experience to walk over to my monitor and see what looked like a real image come to life in front of me. One thing I didn’t expect was how much shooting moving image plates benefited the final product. The waterfall in Tree of Life and campfire in Affinity really did something fascinating when shot shallow on anamorphic lenses.

Tree of Life
Affinity

In future I'd like to develop the format further, perhaps by looking at ways to switch scenes in real time, or by creating a time-lapse plate of the sun passing across a scene and rigging lighting in the studio to try and sell the passing of time. With motion control tech becoming more accessible and affordable than ever, I can't see why you couldn't augment some pretty wild FPV stuff with this method too.

So yeah, all in all, I’d certainly recommend this technique to anyone attempting a similar result, especially with product commercials. If you have any further questions, just give me a shout.
