Getting Started with Face Tracking in ARKit on iOS
Building a Glasses “try on” experience with Reality Composer & ARKit
What is ARKit
Maybe you're aware of augmented reality: perhaps you've seen the countless filters, morphs, and floppy bunny ears that apps like Snapchat and Instagram can superimpose onto your face. But have you ever explored how this actually works?
On iOS these experiences can be built with the ARKit framework. First announced in 2017 and now in its sixth iteration, ARKit can be used to build a wide range of applications, from games to navigation to visualising furniture in a room or a new paint colour on a wall.
In this post we will explore the basics of ARKit's face tracking feature: placing an object (a 3D model of some glasses, in this case) onto your face and having it move around with you as if you were wearing it.
What you’ll need
All you need to get started is Xcode (I'm using v14) and a device with a front-facing TrueDepth camera or an A12 Bionic chip or later (an iPhone X or later running iOS 13 or later).
Setting the scene
The first thing you will need is the 3D model you are going to use. These can be found from various sources online; I used Sketchfab, which has some good free-to-download resources. Just make sure to find one in the .usdz file format.
Once you have your model, open up Reality Composer from the menu in Xcode:
Xcode > Open Developer Tool > Reality Composer
Then create a new document via File > New and choose a Face anchor. It's up to you if you want to uncheck Use template content; this just adds a "Hello World" thought bubble to your scene.
Once the scene appears with the face anchor, drag your .usdz model file onto the window to add it to the scene. You might need to zoom out to see the model, depending on its size. Select it and use the green, red, and blue position controls, along with the Scale slider in the Transform pane, to move the model into place on the face anchor, similar to below.
Once you are happy, save the scene somewhere easy to find, as we will need it later, and we can move on to some code!
Bringing it to life
Create a new project in Xcode and choose the iOS Augmented Reality App template. I used SwiftUI as the Interface and kept the Content Technology as RealityKit. This will give you a sample AR project; if you run it on your device now, you'll see a 3D cube in front of you:
Go ahead and remove the Experience.rcproject file that contains the cube scene from the project. Now add the Reality Composer (.rcproject) file that you saved with your face anchor scene to the project.
First we need ARKit, so add it beside the other imports at the top of the file:
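The template's ContentView.swift already imports SwiftUI and RealityKit; ARKit goes alongside them:

```swift
import SwiftUI
import RealityKit
import ARKit // needed for ARFaceTrackingConfiguration
```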
Next we need to tell the app to read from our newly added Reality Composer face anchor file, so change this line
let boxAnchor = try! Experience.loadBox()
to (assuming you called your file “Glasses”):
let faceAnchor = try! Glasses.loadScene()
Next we need to tell the ARView to use a face tracking configuration so that the app uses the correct user-facing camera. Add the following line:
let faceTrackingConfig = ARFaceTrackingConfiguration()
Finally, change the line that adds the anchor to the scene so it appends faceAnchor instead of boxAnchor, and run the ARView's session with the face tracking configuration.
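Putting the previous steps together, the view code ends up looking something like this. This is a sketch based on the stock SwiftUI Augmented Reality App template; I've also added a check of `ARFaceTrackingConfiguration.isSupported` (a real ARKit property, though not mentioned above) so the app degrades gracefully on unsupported devices:

```swift
import SwiftUI
import RealityKit
import ARKit

struct ContentView: View {
    var body: some View {
        ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)

        // Face tracking needs a TrueDepth camera or an A12 chip or
        // later, so bail out early on unsupported hardware.
        guard ARFaceTrackingConfiguration.isSupported else { return arView }

        // Use the front (user-facing) camera with face tracking.
        let faceTrackingConfig = ARFaceTrackingConfiguration()
        arView.session.run(faceTrackingConfig)

        // "Glasses" is the name of my .rcproject file; Reality Composer
        // generates a matching Swift type with a load method per scene.
        let faceAnchor = try! Glasses.loadScene()
        arView.scene.anchors.append(faceAnchor)

        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}
```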
Run the app on your device again; it should now be using the user-facing camera, and you should see yourself with some fancy new glasses on! 🤓
If it doesn't look quite right, you can select your Reality Composer file in Xcode, click Open in Reality Composer, make any positional tweaks, and save your changes before running the app in Xcode again.
Where to go from here
We've only scratched the surface of what you can do with face tracking in ARKit in this post. Take a look at this post from Kodeco to explore how you can attach emojis to specific facial features (eyes, nose, mouth) and manipulate them using facial expressions.
The finished sample project is available here on GitHub.
Thanks for reading, and I hope you enjoyed it!