Getting started with RealityKit: Record + Replay AR Sessions
Part 5: Stay seated while creating your AR apps
There’s an amazing, semi-hidden tool for developing AR apps that Apple released this year. I originally thought it was only for use within Reality Composer, but since Xcode 11 you can record actions and the camera feed, then replay them in any ARKit app. This includes SceneKit, Metal, SpriteKit, Unity, and, most relevant for this article, RealityKit. Personally I haven’t used Reality Composer much, but a fellow AR enthusiast here in London showed me how to use this tool a little while ago.
This would have been particularly useful when I was previously working on AR navigation solutions, though I wouldn’t have gotten nearly as much exercise if it had been a feature back then!
Recording the Session
To get started, open Reality Composer and either open an existing project or create a new one. From there, tap the type of anchor you want to use. If you want the recorded session for a scene that initially looks for a horizontal plane, choose Horizontal; the same goes for a vertical plane, an image, a face, or even a scanned object.
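The anchor type you choose in Reality Composer corresponds to an anchoring target in your RealityKit code. A minimal sketch of the mapping (the resource group and image name are hypothetical placeholders):

```swift
import RealityKit

// Each Reality Composer anchor choice has a RealityKit equivalent:
let horizontalAnchor = AnchorEntity(plane: .horizontal)  // Horizontal
let verticalAnchor = AnchorEntity(plane: .vertical)      // Vertical
let imageAnchor = AnchorEntity(
    .image(group: "AR Resources", name: "poster"))       // Image (names assumed)
let faceAnchor = AnchorEntity(.face)                     // Face

// Attach your entities to the anchor, then add it to the scene, e.g.:
// arView.scene.addAnchor(horizontalAnchor)
```

Pick the recording type that matches the anchor your scene actually uses, otherwise the replayed session won’t exercise the detection path your app depends on.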
In summary, the steps are:
- Open Reality Composer.
- Open existing project or create a new one.
- Tap the three-dot (•••) menu.
- Tap Record AR Session.
- Record the session and then share to your computer.
The GIF below goes through the above steps on an iPad:
While recording the session, if you tapped the Horizontal option when creating a new project, you’ll see some helpful boxes appear on the floor in front of you; these boxes let you know how ARKit is doing at detecting the horizontal surface. This works the same way for vertical surfaces. Here’s an example:
When this recording is replayed it essentially stands in for your device’s live camera feed, but the scene’s entities will still animate and can still be interacted with, with some limitations which I’m still to uncover completely.
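Because entities remain interactive during a replay, tap handling works as usual. A minimal sketch, assuming an `ARView` property named `arView` and that the tapped entity has a collision shape (both assumptions, not from the original):

```swift
import RealityKit
import UIKit

// Hit-test a tap against entities in the scene; this works during
// a replayed session just as it does with a live camera feed.
@objc func handleTap(_ sender: UITapGestureRecognizer) {
    let point = sender.location(in: arView)
    if let entity = arView.entity(at: point) {
        // Spin the tapped entity half a turn as simple feedback.
        var transform = entity.transform
        transform.rotation *= simd_quatf(angle: .pi, axis: [0, 1, 0])
        entity.move(to: transform, relativeTo: entity.parent, duration: 0.5)
    }
}
```

This makes replayed sessions useful for iterating on gesture code without re-scanning the room each time.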
Once the video is recorded, hit the share button, send it to your computer by whatever means works best for you, and open up your ARKit project in Xcode 11.
Replaying the Session
Next, edit the scheme. The steps are:
- Edit Scheme…
- Run > Options > ARKit > tick the Replay data checkbox
- Locate your file using the drop-down menu
- Then build and run the app on your device.
Here’s a GIF of how to do so:
Now when you run your app on your device, it will behave as though you’re scanning and moving around the same environment as in your recording.
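One way to confirm the replayed session is feeding your app the same detection events as a live one is to log the anchors ARKit adds. A small sketch (the `ReplayLogger` class name is my own; hook it up via `arView.session.delegate`):

```swift
import ARKit

// Logs plane anchors as ARKit surfaces them, whether the session
// data comes from the live camera or a replayed recording.
class ReplayLogger: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors where anchor is ARPlaneAnchor {
            print("Detected plane anchor:", anchor.identifier)
        }
    }
}
```

If the same planes appear on every run, you know your replay file is doing its job and any differences in behaviour come from your own code.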
I have saved a few horizontal AR sessions which I plan to reuse across multiple projects. This feature is part of Xcode 11, which is still in beta at the time of writing, so you may encounter some minor issues when using it. But even if it doesn’t run perfectly every time, it’s still significantly speeding up AR development for me.
One Step Further
If your app doesn’t require any user input, and you don’t even want to look down at your phone, you can mirror your screen using QuickTime. There are a ton of articles on how to do this, such as this one:
How to mirror your iPhone with a Lightning cable using QuickTime in OS X Yosemite
The result looks like this:
I’m expecting Apple to let this work with the simulator at some point, but until that happens this is the next best thing in my opinion.
Thanks for reading, and don’t forget to 👏 if you’ve enjoyed this article, and stay tuned for more.