What I Learned From Making ARnold, An Augmented Reality Short Film

Greg Feingold
5 min read · May 21, 2017


This was partially adapted from a talk I gave at AfterNow Studios at their AR Meetup in March.

“The dog is floating in the air.”

“Wait… now he’s halfway in the wall. His tail is wagging. Oh dear.”

The first night I demoed ARnold, my thesis project developed at USC’s MxR Lab, I heard comments like these a lot. ARnold is a short film that takes place in whatever room you are standing in while watching it; it tells the story of a dog who walks around the carpet beneath you and tries to escape through a holographic door placed on the nearest wall.

At least… that was the concept. After a semester of work, the spatial mapping and understanding weren’t quite there yet, and Arnold spent more time walking through walls than telling his story. We (the team I assembled: myself, two developers, two animators, and an associate producer) eventually fixed the app’s spatial mapping. Along the way, I learned a valuable lesson about immersive storytelling: using new technologies to tell a story can be incredibly valuable, but if you can’t get the programming absolutely right, the technology will just get in the way of the story.

Here are a few other things I learned from spending my senior year developing ARnold:

1. Stories that are “right for AR” must be deeply personal stories.

As an augmented reality storyteller, I was asking audiences to connect with a physical object that I placed in their own environment. This is, to be frank, asking a lot of my audience. Storytelling with physical objects is challenging because in most cases it is the viewer, not a third-party storyteller, who imbues an object with emotional meaning. I can’t hand you a hockey trophy and make you remember the hockey match you won in high school. But if you are going through old boxes and come across that trophy yourself, suddenly a whole wave of memories and emotions might wash over you.

Making ARnold was a ruff time.

With ARnold, after brainstorming with many friends and family members, I picked the story of a dog trying to escape home. It turns out that having a pet try to escape is a universal experience. Yet it is one that will instantly bring to mind a personal experience for many viewers, which is what makes it apt for AR. Maybe the object I’m placing in your room isn’t your dog, but it could remind you of the dog you have back at home, or the pet you had long ago. As a storyteller, that is just as powerful.

2. Work with technical limitations in mind.

I already mentioned the spatial mapping challenges inherent in first-generation AR devices. But there are a few more to keep in mind.

There are tons of powerful use cases where we could use spatial understanding to tell context-dependent stories. My mind was spinning with the opportunities: different scenes playing out in different rooms, a couch spawning one character while a holographic video plays on a real-world TV, etc. But as of mid-2017, the HoloLens and Tango still have a lot of trouble determining what objects they are seeing in the real world. There are ways to improve this spatial understanding using Vuforia and OpenCV, but we are still a ways away from a perfect object database within an AR platform.

Another obvious limitation: field of view. Whether you are working on Tango, HoloLens, or another AR device, you will be limited by the rectangle in which the display can show holograms. Far from being a hindrance, this actually helped my development process. We chose the size of our AR dog based on the field of view of the HoloLens, ensuring that Arnold always stays fully in the viewer’s sight. Focusing on small objects is a design challenge that can actually yield better storytelling results.
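As a rough sketch of how that sizing math works: the tallest hologram that fits fully in view is about 2 × distance × tan(FOV/2). The snippet below is illustrative only, written Unity-style (the post's engine isn't specified); the 17.5° figure is the commonly cited vertical field of view for the first-generation HoloLens, not a measured value, and the 1.5 m viewing distance is an assumption.

```csharp
using UnityEngine;

// Illustrative sizing helper: given a vertical field of view and a typical
// viewing distance, how tall can a hologram be and still fit fully in frame?
public static class HologramSizing
{
    public static float MaxVisibleHeight(float verticalFovDegrees, float distanceMeters)
    {
        float halfFovRadians = 0.5f * verticalFovDegrees * Mathf.Deg2Rad;
        return 2f * distanceMeters * Mathf.Tan(halfFovRadians);
    }
}

// Example: MaxVisibleHeight(17.5f, 1.5f) is roughly 0.46 m,
// conveniently close to the height of a small dog.
```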

A final limitation: current headsets can only scan one room at a time. We are not at the stage where we can tell multi-room narratives in AR, as cool as that might be.

3. Invent new storytelling techniques.

This was the most exciting part of the project for me. With ARnold I was able to take the film language I had learned in my four years at USC’s School of Cinematic Arts and transform it for an entirely new medium. Some of the techniques I experimented with included:

The “Magic Window”

This technique lets you peer into the virtual world by placing a rendered “window” on a real wall. In the final scene of ARnold, the dog escapes out the front door into the “real world”. We placed a 360 photo sphere on the other side of the animated door (which, in real life, was just a wall). The proof that this technique was a success? During demos, people kept running into walls trying to get a look at what Arnold was doing outside.
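If you’re curious what that looks like in code, here is a minimal sketch, assuming a Unity-style setup (typical for HoloLens and Tango apps of this era, though not spelled out in this post); doorAnchor, photoSpherePrefab, and the 2 m depth are placeholder names and values, not our actual assets.

```csharp
using UnityEngine;

// A minimal sketch of the "magic window": park an inside-out 360 photo sphere
// just behind the real wall that the holographic door is anchored to, so the
// doorway appears to open onto the world outside.
public class MagicWindow : MonoBehaviour
{
    public Transform doorAnchor;          // door hologram pinned to a real wall
    public GameObject photoSpherePrefab;  // sphere with an inward-facing 360 photo material
    public float depthBehindWall = 2f;    // how far "outside" the sphere sits

    void Start()
    {
        // doorAnchor.forward is assumed to point out of the wall into the room,
        // so the sphere goes on the opposite side of the wall.
        Vector3 spherePosition = doorAnchor.position - doorAnchor.forward * depthBehindWall;
        Instantiate(photoSpherePrefab, spherePosition, Quaternion.identity);
    }
}
```

In practice you also need something to hide the sphere everywhere except the doorway, whether that is the spatial-mapping mesh acting as an occluder or a stencil-style shader cut into the door frame.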

The “Fade”

I struggled a lot to find ways to direct a viewer’s attention so I could tell the story the way I wanted to tell it. One technique I came up with was to fade in different holograms one by one so that the audience follows the objects. In one scene, Arnold is balancing on a stack of books trying to reach the door. We first see one book, then two more, then finally the full stack of books comes into view.
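Here is a minimal sketch of how that kind of sequencing might be scripted, again assuming a Unity-style setup and materials rendered with a transparent/fade shader; the field names and timings are placeholders, not the exact values from ARnold.

```csharp
using System.Collections;
using UnityEngine;

// A minimal sketch of the "fade" technique: reveal holograms one at a time
// so the viewer's eye follows them across the scene.
public class SequentialFade : MonoBehaviour
{
    public Renderer[] holograms;      // e.g. book 1, then books 2-3, then the full stack
    public float fadeDuration = 1f;   // seconds per fade-in
    public float delayBetween = 0.5f; // pause before revealing the next object

    IEnumerator Start()
    {
        foreach (Renderer hologram in holograms)
        {
            yield return StartCoroutine(FadeIn(hologram));
            yield return new WaitForSeconds(delayBetween);
        }
    }

    IEnumerator FadeIn(Renderer hologram)
    {
        Color color = hologram.material.color;
        for (float t = 0f; t < fadeDuration; t += Time.deltaTime)
        {
            color.a = Mathf.Clamp01(t / fadeDuration);
            hologram.material.color = color;
            yield return null;
        }
        color.a = 1f;
        hologram.material.color = color;
    }
}
```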

Spatial Audio

Keep in mind that in AR, your audience won’t always know where your holograms are. One way to point them in the right direction is spatial audio. (Note: the HoloLens speakers are not well suited to convincing spatial audio, and if you’re demoing on a Tango you had better plug in some headphones.)
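For reference, wiring positional audio onto a hologram takes only a couple of lines in a Unity-style setup; the sketch below assumes a spatializer plugin (such as Microsoft’s HRTF spatializer) is enabled in the project’s audio settings, which is an assumption rather than something covered in this post.

```csharp
using UnityEngine;

// A minimal sketch: make a hologram's AudioSource fully positional so its
// sound appears to come from the hologram's location in the room.
[RequireComponent(typeof(AudioSource))]
public class HologramAudio : MonoBehaviour
{
    void Awake()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.spatialBlend = 1f;   // 0 = flat 2D audio, 1 = fully 3D/positional
        source.spatialize = true;   // route through the configured spatializer plugin
    }
}
```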

Thanks for reading! The team I assembled for ARnold is already hard at work conceptualizing our next AR project… stay tuned. ;)

If you have any questions about the development process, feel free to reach out at gregjfeingold@gmail.com.
