What We’ve Been Working On, Part 1: GameStop Ads
We have news. In addition to shooting a couple more episodes of The FOO Show (they’re in post-production now, and will be posted once we work through final edits and QA), with more in pre-production, we’ve been working on some projects that use our virtual reality tools for non-VR applications. The first of those just went live, so I’m thrilled that I can tell you about it.
For the last few months, we’ve been working with GameStop and The Richards Group on a new ad campaign that uses FOO’s animation tools and our experimental virtual studio. Using our tools, we’re able to create animated character performances with a workflow that’s closer to a live action shoot than a traditional animation pipeline. The first spot debuted this week, with more coming soon.
So how did this happen? The Richards Group contacted us several months ago, after seeing The FOO Show. They wanted to see if we could help them put two new characters, Kyle and Brooks, into some of the biggest video games of the year. GameStop and The Richards Group wanted to build on the machinima spots they had done in the past, while giving the creative team more flexibility with the main characters — movement and responses beyond what’s possible with traditional machinima. And, of course, we needed to do all of this on an aggressive schedule, which wouldn’t leave time for more traditional computer animation. We hadn’t built FOO with commercial work in mind, but we were excited to take on the challenge and thrilled at the opportunity to showcase our tech in front of a national audience.
So what exactly is happening in this spot? First, we needed to capture the background footage from Destiny 2. I’d never seen that kind of in-game capture in person before, so I was surprised at how technically difficult it was to get exactly the shots we needed from the game. It’s part deconstructing the video game’s AI, part nature documentary with uncooperative animals.
The editing team took the hours of footage that was shot in Seattle and edited it down into a series of plates — the background footage we were going to integrate Kyle and Brooks with. To do that, we reconvened in Austin with the actors — Camrus Johnson plays Kyle and Brian Morabito plays Brooks — on a most unconventional set.
Because all of our work happens virtually, we don’t need traditional cameras or lights on set. The physical set is just a big room with good sound and enough space for the cast and crew. The virtual set, where all the magic happens, is a little more complex.
For this shoot, we put Kyle and Brooks on an enormous virtual green screen set. All of the lighting happens in VR (or is handled by our post-production partners, depending on the shoot), and all of our cameras are virtual. Our “cameras” are simply high-end gaming PCs hooked up to HDMI capture rigs that run our camera client — a special version of the FOO client.
We offer several options when it comes to camera controls, but the one we’re using for the GameStop spots mimics a standard camera drone. We map the typical drone controls to a standard Xbox controller so the camera operator can “fly” the virtual camera around our virtual set. The gamepad’s two sticks and triggers give the operator all six degrees of freedom, and let them take advantage of the muscle memory they’ve built flying real drones.
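To make that concrete, here’s a minimal sketch of what a drone-style mapping might look like. The axis assignments, speeds, and the CameraPose structure are illustrative assumptions, not FOO’s actual camera client code:

```python
import math
from dataclasses import dataclass

@dataclass
class CameraPose:
    x: float = 0.0      # metres, world space
    y: float = 0.0
    z: float = 0.0
    yaw: float = 0.0    # radians
    pitch: float = 0.0  # radians

MOVE_SPEED = 3.0  # m/s at full stick deflection (illustrative)
TURN_SPEED = 1.5  # rad/s at full deflection (illustrative)

def update_camera(pose, lx, ly, rx, ry, lt, rt, dt):
    """Integrate one frame of drone-style camera motion.

    lx/ly: left stick  -> yaw and climb/descend (throttle)
    rx/ry: right stick -> strafe and forward/back
    lt/rt: triggers    -> tilt the camera down / up
    Stick axes are in [-1, 1], triggers in [0, 1], dt in seconds.
    """
    pose.yaw += lx * TURN_SPEED * dt
    pose.pitch += (rt - lt) * TURN_SPEED * dt
    pose.pitch = max(-math.pi / 2, min(math.pi / 2, pose.pitch))

    # Translate in the camera's local frame so "forward" follows the yaw.
    fwd = ry * MOVE_SPEED * dt
    strafe = rx * MOVE_SPEED * dt
    pose.x += fwd * math.sin(pose.yaw) + strafe * math.cos(pose.yaw)
    pose.z += fwd * math.cos(pose.yaw) - strafe * math.sin(pose.yaw)
    pose.y += ly * MOVE_SPEED * dt  # throttle: climb or descend
    return pose

# Called once per rendered frame with the current gamepad state:
# pose = update_camera(pose, lx, ly, rx, ry, lt, rt, dt=1/60)
```

With a mapping like this, the operator’s stick inputs integrate frame by frame into a smooth camera move, just as they would on a physical drone.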
Then it’s just down to the performances. When the actors put on their off-the-shelf HTC Vives, they essentially step into the virtual studio. As always, we just use the actors’ movements, as reported by the Vive, to animate their avatars.
When they’re wearing the headsets, we use some complicated math and the three points of data we get from the Vive (head and two hands) to mirror the actors’ movements on their avatars. Inverse kinematics and machine learning let us animate believable human characters from very little actual data. If the hardware doesn’t track something directly, like eye or mouth movement, we build a software model to handle it. There’s a lot going on under the hood to make this all work.
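To give a flavor of the inverse kinematics piece, here’s a deliberately simplified sketch: a 2D, two-bone solver that recovers plausible shoulder and elbow angles from nothing but a hand position. A real full-body solver works in 3D and blends in learned priors; the bone lengths and plane restriction here are illustrative assumptions rather than anything from FOO’s pipeline:

```python
import math

UPPER_ARM = 0.30  # shoulder-to-elbow length, metres (illustrative)
FOREARM   = 0.28  # elbow-to-wrist length, metres (illustrative)

def two_bone_ik(shoulder, hand):
    """Recover shoulder and elbow angles for a two-bone arm in a plane.

    Only the endpoints are known: an estimated shoulder position and the
    hand position reported by the tracked controller. The elbow bend falls
    out of the triangle the three points form, via the law of cosines.
    Returns (shoulder_angle, elbow_angle) in radians.
    """
    dx = hand[0] - shoulder[0]
    dy = hand[1] - shoulder[1]
    dist = math.hypot(dx, dy)
    # Clamp so an out-of-reach target extends the arm instead of failing,
    # and a coincident target doesn't divide by zero.
    dist = max(min(dist, UPPER_ARM + FOREARM - 1e-6), 1e-6)

    # Interior elbow angle, from the triangle's three side lengths.
    cos_elbow = (UPPER_ARM**2 + FOREARM**2 - dist**2) / (2 * UPPER_ARM * FOREARM)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))

    # Shoulder angle: direction to the hand, offset by the triangle's corner.
    cos_off = (UPPER_ARM**2 + dist**2 - FOREARM**2) / (2 * UPPER_ARM * dist)
    offset = math.acos(max(-1.0, min(1.0, cos_off)))
    return math.atan2(dy, dx) + offset, elbow
```

The same idea, scaled up to a full skeleton and combined with learned models for the parts the hardware can’t see, is what turns three tracked points into a whole performing body.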
Of course, the actors don’t have to worry about that — all they have to do is put on the headset and act. When they’re in the studio, the actors can see each other, the cameras, any cues they need for the shot, and a monitor that gives them immediate feedback on their performance. Once they’re in the virtual studio, it’s similar to working on a real-world set. The director calls “Action!” and the actors run their performances.
This lets us easily record multiple takes of the actors’ performances — something that’s time-consuming and expensive in traditional animation. It’s one of the key benefits of the FOO approach: extra takes are nearly free, which gives GameStop more options when they’re looking for the perfect 30-second spot.
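Because a take is captured as a stream of tracking data rather than rendered frames, recording another one costs almost nothing. A hypothetical take recorder could be as simple as this sketch (the class and pose format are assumptions for illustration, not FOO’s actual tooling):

```python
import time

class TakeRecorder:
    """Log time-stamped tracker samples so every take can be replayed later.

    Each sample is (seconds_since_action, head, left_hand, right_hand);
    the pose format is whatever the tracking layer reports. Illustrative only.
    """

    def __init__(self):
        self.takes = []        # completed takes, in shooting order
        self._current = None   # samples for the take in progress
        self._t0 = 0.0

    def start_take(self):
        """Director calls 'Action!': begin a fresh sample buffer."""
        self._current = []
        self._t0 = time.monotonic()

    def record_sample(self, head, left_hand, right_hand):
        """Append one frame of tracker data (called at the tracking rate)."""
        if self._current is not None:
            self._current.append(
                (time.monotonic() - self._t0, head, left_hand, right_hand))

    def end_take(self):
        """Director calls 'Cut!': file the take and return the take count."""
        if self._current is not None:
            self.takes.append(self._current)
            self._current = None
        return len(self.takes)
```

Each saved take can later be replayed through the same avatar rig, so the team can pick the best performance long after the actors have left the set.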
What does it mean for our clients? By using FOO, GameStop and The Richards Group can create animated shorts on a timeline that traditional animation can’t approach, with the cost and flexibility of a live action shoot.
I’m thrilled about this new application of the work we’ve done at FOO. We’ve melded motion capture, video game tech, traditional animation, and virtual reality to make a completely new pipeline for 2D animation. This ad spot is the first iteration of that, but we’ve already made significant changes for the next shoots that are part of this campaign. Beyond that, we’re thinking about ways to use FOO to reduce and remove the barriers that make 2D animation so time-consuming and expensive and limit it to big studios with big budgets.
But wait, there’s more! Next week I’ll be able to share some of the other exciting projects we’ve been working on.