iPhone X: Using Face-Tracking for more than Animojis

Prolific Interactive
Nov 3, 2017

by Virakri Jinangkul, Senior Design Technologist

Eat It! A Face-Tracking Game for iPhone X

iPhone X, released today, officially brings face-based augmented reality to our personal devices. AR has already demonstrated its value in both personal and business use cases, from educational demonstrations to in-home previews of a product before purchase. Facial controls take augmented reality to the next level in terms of what companies can offer their mobile users.

Apple introduced us to face-based AR with a fun gimmick that showed our facial movements replicated by 3D characters. While the Animoji itself might not be the most useful application of this technology, it effectively demonstrates the level of detail that Apple’s facial tracking can recognize in our muscle movements, which opens the door to controlling other in-app commands through simple changes in expression.
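To get a feel for that level of detail, ARKit surfaces each tracked muscle movement as a blend shape coefficient on the face anchor. Here is a minimal sketch of reading those values; the view controller wiring and the 0.5 threshold are our own illustrative choices, not Apple’s sample code:

```swift
import UIKit
import ARKit

// Minimal sketch: reading ARKit's per-muscle blend shape coefficients from
// the TrueDepth camera. The 0.5 threshold is illustrative, not prescriptive.
class FaceTrackingViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking is only available on TrueDepth hardware (iPhone X).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.delegate = self
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // ARKit calls this every time the tracked face updates.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }

        // Each blend shape is a 0.0 to 1.0 value for one facial movement.
        let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0

        if jawOpen > 0.5 {
            // Mouth is wide open; treat it as an in-app command.
            print("jaw open")
        }
    }
}
```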

Of course, we wanted to test it out right away. We created a prototype game for the iPhone X as a quick and engaging way to show how facial tracking can be applied to more than mirroring expressions. The game uses facial control for navigation, demonstrating how AR might play into accessibility by enabling hands-free, interactive menu options.

But what about more “serious” uses than Animojis and games? As you can see from the video, the prototype for “Eat It!” lets you move side to side by turning your head, and move forward to capture the food characters by opening and closing your mouth. The game was created as a proof of concept to show how any number of facial movements could correlate to any function we choose.
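To sketch what that mapping could look like in code (the thresholds, movePlayer(_:), and eatNearbyFood() below are hypothetical placeholders, not the actual “Eat It!” source), head yaw read from the face anchor’s transform can drive the side-to-side movement, while the jawOpen coefficient triggers eating:

```swift
import Foundation
import ARKit

// Hypothetical sketch of wiring face data to game actions; movePlayer(_:)
// and eatNearbyFood() stand in for whatever the game actually calls.
enum Direction { case left, right }
func movePlayer(_ direction: Direction) { /* slide the player sideways */ }
func eatNearbyFood() { /* capture the nearest food character */ }

func handleFaceUpdate(_ faceAnchor: ARFaceAnchor) {
    // Head yaw (rotation about the vertical axis), read from the anchor's
    // transform. Exact axes and signs depend on the session's world alignment.
    let m = faceAnchor.transform
    let yaw = atan2(m.columns.2.x, m.columns.2.z)

    // Turn the head past a small dead zone to move left or right.
    if yaw > 0.15 {
        movePlayer(.right)
    } else if yaw < -0.15 {
        movePlayer(.left)
    }

    // Opening the mouth wide enough counts as an "eat".
    let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
    if jawOpen > 0.6 {
        eatNearbyFood()
    }
}
```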

With the growing demand for accessibility in apps, the future use cases for face-based AR are endless. From navigation and security to face-based commands between connected devices, all of this is now possible without touching the screen.
