Getting started with RealityKit: Record + Replay AR Sessions

Part 5: Stay seated when creating your AR apps

Max Fraser Cobb
Aug 26, 2019 · 4 min read

There’s an amazing, semi-hidden tool for developing AR apps that Apple released this year. I originally thought it was just something for use within Reality Composer, but since Xcode 11 you can record actions and camera feed to replay in any ARKit app. This includes SceneKit, Metal, SpriteKit, Unity, and, most relevant for this article, RealityKit. Personally I haven’t used Reality Composer much, but a fellow AR enthusiast here in London showed me how to use this tool a little while ago.

This would have been particularly useful when I was previously working on AR navigation solutions, though I wouldn’t have gotten nearly as much exercise if it had been a feature back then!


Recording the Session

To get started, open Reality Composer and either open an existing project or create a new one. From there, tap the type of anchor you want to use. If you want to use the recorded session for a scene which initially looks for a horizontal plane, choose Horizontal; otherwise you can do the same for a vertical plane, image, face, or even a scanned object.

In summary, the steps are:

  1. Open Reality Composer.
  2. Open existing project or create a new one.
  3. Tap the 3 dots menu, then the </> Developer button.
  4. Tap Record AR Session.
  5. Record the session and then share to your computer.

The GIF below goes through the above steps on an iPad:

While recording the session, if you chose the Horizontal option when creating the project, you’ll see some helpful boxes appearing on the floor in front of you; these boxes let you know how ARKit is doing at detecting the horizontal surface. The same applies to vertical surfaces. Here’s an example:

Recording an AR Session using Horizontal anchoring
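For reference, the same horizontal anchoring can be set up in a RealityKit app so the replayed session has something to detect. Here’s a minimal sketch (the function name and `arView` parameter are just illustrative):

```swift
import ARKit
import RealityKit

// Assuming `arView` is an ARView already on screen,
// e.g. from your storyboard or SwiftUI container.
func startHorizontalSession(in arView: ARView) {
    // Ask ARKit to detect horizontal planes, matching the
    // Horizontal anchor choice made in Reality Composer.
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal]
    arView.session.run(config)

    // Anchor content to the first horizontal plane ARKit finds;
    // entities added to this anchor appear once a plane is detected.
    let anchor = AnchorEntity(plane: .horizontal)
    arView.scene.addAnchor(anchor)
}
```

Because the replay drives ARKit exactly like a live camera feed, this anchor will attach to the floor in your recorded room just as it would in person.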

When this recording is replayed it will essentially play back on your device as captured, but the scene’s entities will still animate and can still be interacted with, with some limitations which I’ve yet to uncover completely.

Once the video is recorded, hit the share button, send it to your computer by whatever means works best for you, and open your ARKit project in Xcode 11.

Replaying the Session

Next, edit the scheme. The steps are:

  1. Edit Scheme…
  2. Run > Options > ARKit > tick the Replay data checkbox
  3. Locate your recorded file using the drop-down menu
  4. Build and run the app on your device.

Here’s a GIF of how to do so:

Attaching the video to the run scheme for your device

Next, when you run your app on your device, it will behave as though you’re scanning and moving around the same environment as in your recording.
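A quick way to confirm the replay is feeding your app real tracking data is to log the anchors ARKit adds, using the standard ARSessionDelegate callback. A minimal sketch (the class name is just illustrative):

```swift
import ARKit

// Assign an instance as your session's delegate before running, e.g.
//   arView.session.delegate = ReplayLogger()
// and keep a strong reference to it, since the delegate is held weakly.
class ReplayLogger: NSObject, ARSessionDelegate {
    // Called for anchors detected in a live feed and during replay alike.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors where anchor is ARPlaneAnchor {
            print("Detected plane anchor:", anchor.identifier)
        }
    }
}
```

If the scheme is set up correctly, you should see plane anchors logged for the surfaces you scanned while recording, without moving your device at all.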

I have saved a few horizontal AR Sessions which I plan on re-using across multiple projects. This feature is part of Xcode 11, which is still in beta at the time of writing, so you may encounter some minor issues when using it. But even if it doesn’t run perfectly every time, it still significantly speeds up AR development for me.

One Step Further

If your app doesn’t require any user input, and you don’t even want to look down at your phone, you can mirror your screen to your Mac using QuickTime. There are plenty of articles on how to do this.

The result looks like this:

View your AR App using QuickTime

I’m expecting Apple to let this work with the simulator at some point, but until that happens this is the next best thing in my opinion.


Thanks for reading, and don’t forget to 👏 if you’ve enjoyed this article, and stay tuned for more.

Written by Max Fraser Cobb

Augmented Reality Engineer and mathematician. Excited for the future of AR and what amazing people are going to create.
