How to animate AR objects with SwiftUI and RealityKit

Millie Dev
Twinkl Educational Publishers
7 min read · Oct 17, 2019

Reality Composer is a new app, bundled with Xcode, for creating AR experiences. When a 3D object, or Entity, has been added to the scene, it allows you to create Action sequences that animate, move or transform that Entity. Each Action sequence comes with a variety of triggers, from collisions with other objects to ‘collisions’ with the camera, from a tap on the screen to the start of the scene. This tutorial shows you how to bypass all of these and instead link SwiftUI controls directly to custom Action sequences.

This is the basis of creating a user interface for AR, as the control panel you create here will need to be able to affect objects in the scene.

When you open Xcode, create a new project with the Augmented Reality App template. The first two files you should change are the ContentView and AppDelegate files, which will be exactly the same as they were in my last tutorial, Create your first AR app with RealityKit and SwiftUI. AppDelegate is about halfway down, and ContentView is at the end. AppDelegate sets up a reference to a class called DataModel, which we will create in a moment. This allows us to permanently store data in Swift that would otherwise only exist in the Reality File. ContentView combines the UI and the AR window, if AR is enabled.
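For reference, here is a minimal sketch of what that ContentView might look like, assuming the DataModel class we create later, an enableAR property for the toggle (an assumed name carried over from that tutorial), and the ARUIView control panel we build below:

```swift
import SwiftUI

// A minimal sketch of ContentView: the SwiftUI control panel on the
// left, and the AR window on the right only when AR is enabled.
struct ContentView: View {
    @ObservedObject var data = DataModel.shared

    var body: some View {
        HStack {
            ARUIView(data: data)          // the SwiftUI control panel
            if data.enableAR {            // 'enableAR' is an assumed name
                ARDisplayView(data: data) // wraps the shared ARView
            }
        }
    }
}
```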

You will also need to copy ARDisplayView from that tutorial, which returns the ARView object from the DataModel class rather than creating a new one.
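Based on that description, the wrapper might look like this sketch, assuming the ARView is stored on DataModel as arView:

```swift
import SwiftUI
import RealityKit

// ARDisplayView hands SwiftUI the ARView that already exists in
// DataModel, so the view and its anchors survive toggling AR off and on.
struct ARDisplayView: UIViewRepresentable {
    var data: DataModel

    func makeUIView(context: Context) -> ARView {
        // Return the stored ARView rather than creating a new one.
        data.arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}
```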

The Reality File uses an Apple file format to store one or more AR Scenes and every Entity (3D object) inside them.

If you double-click on the Experience.rcproject Reality File, you’ll see a preview of the ‘Box’ scene, which is basically a steel box at the origin. When you press the Open in Reality Composer button in the top right, you’ll be able to edit the file directly. When Reality Composer has opened the file, click the Behaviours button in the top right. This should bring up a panel at the bottom of the screen. Click the plus button to create a new Behaviour, scrolling to the bottom of the pop-up to click on Custom. This allows you to choose any trigger for the Behaviour in your Reality File. Click anywhere in the box that says ‘Add a Trigger to this Behaviour’. Scroll to the bottom of this pop-up and choose Notification. In the Identifier text field of the Notification, write ‘Trigger event’.

Now you can add your own custom action. After you click ‘Add an Action to the sequence’ you can choose whatever Action you want from the list. Emphasise allows you to take advantage of built-in RealityKit animations. You can Hide the box, but as it’s already visible by default, choosing Show won’t do anything. You can move it, add force to ‘push’ it, or play a sound.

Whatever you choose, make sure you choose the steel box as an Affected Object. If you forget to do this until later, you can click the Choose button next to Affected Objects, and then click the steel box in the scene to select it. If you’ve done it right, it should be green. Press the Done button where the Choose button was before. Now you can add one more Action, which will this time be a Notify. That’s right, we want to send a notification from your code to trigger an Action sequence that then sends a notification back to your code. Without this, our Swift code would have no way of knowing whether the notification was successfully passed to the Reality File.

Remember to add the steel box as an Affected Object here, too.

Let’s make our user interface, which won’t take long with SwiftUI!

Since we have an action that notifies the code that the Action sequence has been completed, I decided we would make a permanent change to the user interface. When you press the button, a boolean value called buttonPressed turns from false to true. You can’t use an if statement inside a view modifier, so you can’t say ‘if buttonPressed’, give the button one title, and then give it another title in the else condition. Instead we need to use a ternary expression: the condition, then a question mark followed by the value if it’s true, then a colon followed by the value if it’s false.
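Here is a sketch of that control panel; the enableAR property is an assumed name carried over from the previous tutorial:

```swift
import SwiftUI

// The control panel: an AR toggle and a button whose title changes
// permanently from 'Notify' to 'Notified' when the sequence completes.
struct ARUIView: View {
    @ObservedObject var data: DataModel

    var body: some View {
        List {
            Toggle(isOn: $data.enableAR) { // 'enableAR' is an assumed name
                Text("AR")
            }
            Button(action: {
                // Bindings can't call functions, so use the shared instance.
                DataModel.shared.triggerAction()
            }) {
                // Ternary: condition ? value if true : value if false
                Text($data.buttonPressed.wrappedValue ? "Notified" : "Notify")
            }
        }
    }
}
```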


There’s no real reason why the user interface has to be permanently changed, but it makes it obvious that the sequence was completed and SwiftUI was aware of that fact. You may notice that we access the wrappedValue of the buttonPressed bool, rather than buttonPressed itself. This is because $data.buttonPressed is a binding, just like the one used by the AR toggle at the top. Binding the variable means that SwiftUI has direct access to modify this property, and will update the user interface when the property changes. The button action has to refer directly to the shared static instance of DataModel, because bindings can’t be used to call Swift functions.

Now we need to create our DataModel, where we will create the trigger for the action sequence. I gave the function we are calling the generic name triggerAction, as I gave you the freedom to decide what your action would be. If you wanted, yours could be something more specific like ‘moveBox’, ‘spin’ or whatever you chose. Just make sure that your ARUIView has the correct function in its button action. It’s worth noting that if you haven’t created at least one Notification trigger in Reality Composer, there won’t be a boxAnchor.notifications property to access. Like me, you may struggle to figure out why you can’t refer to a notification, until you realise you haven’t set it up in Reality Composer.
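Here is a sketch of the whole class. The generated names triggerEvent and actionDone are assumptions, derived from a Notification trigger named ‘Trigger event’ and a Notify action named ‘Action Done’ in Reality Composer:

```swift
import Combine
import RealityKit

// DataModel owns the ARView and the state that SwiftUI binds to.
final class DataModel: ObservableObject {
    static var shared = DataModel()

    @Published var enableAR = true       // assumed name for the AR toggle
    @Published var buttonPressed = false // flips when the sequence completes

    var arView: ARView
    var boxAnchor: Experience.Box

    init() {
        // Create the ARView once, as the default template does,
        // and load the Box scene from the Reality File.
        arView = ARView(frame: .zero)
        boxAnchor = try! Experience.loadBox()
        arView.scene.anchors.append(boxAnchor)

        // Runs when the Notify Action at the end of the sequence fires.
        // 'box' is the Entity chosen as the Affected Object.
        boxAnchor.actions.actionDone.onAction = { box in
            self.buttonPressed = true
        }
    }

    // Posts the 'Trigger event' notification, starting the sequence.
    func triggerAction() {
        boxAnchor.notifications.triggerEvent.post()
    }
}
```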

We create the ARView object when the DataModel class is initialised, in the same way that it is done in the default Augmented Reality App project template.

The difference here is that we add a block of code to the Action Done notification that we set up at the end of our Action sequence. All we’re doing here is setting the buttonPressed bool to true, so that the user interface will change. Obviously you could do anything in this code block, from calling other Swift functions to posting another notification back to the Reality File. You might notice that the block starts with ‘box in’. This passes in whatever Entity you selected as your Affected Object. If you had another Entity that wasn’t the steel box, you could pass that back as the Affected Object instead. This gives the block guaranteed access to an Entity that you know will exist when this Action occurs.

Now when you install this on an iPad, you should have a left side panel with a switch that says AR and a button that says Notify. Point the iPad at a flat horizontal plane like a table or the floor, and flick the AR switch. You may be asked for camera access, and when you allow it you should see what the camera sees. After a few seconds of pointing the device at a horizontal surface, the steel box will appear. When you’re ready to see your steel box in action, press the Notify button. The box should do whatever you chose it to do, and the button should now say Notified. You can press it again, but the name only changes once.

Next Steps

This tutorial introduces some advanced concepts, but it still only creates a basic AR app. It’s supposed to be a blueprint for adding further features, as you now have the building blocks in place to trigger any action from anywhere in code. Since the left side panel of the iPad app is a List, and SwiftUI view builders accept at most 10 direct children, you can only add up to 10 controls to it directly. This could mean the AR toggle and 9 buttons. If you end up wanting more controls than this, you can simply use VStacks to separate each group of 10 controls, as in the sketch below. Since it’s a list, it will grow and scroll automatically as you add more controls.
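If you do reach that limit, the workaround might look something like this sketch (the extra button is hypothetical):

```swift
import SwiftUI

// A sketch of working around the 10-child limit by grouping
// controls into VStacks inside the List.
struct BigPanelView: View {
    @ObservedObject var data: DataModel

    var body: some View {
        List {
            VStack {
                Toggle(isOn: $data.enableAR) { Text("AR") }
                // ...up to nine more controls in this group
            }
            VStack {
                Button("Notify") { DataModel.shared.triggerAction() }
                // ...the next group of up to ten controls
            }
        }
    }
}
```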

As the name suggests, Action sequences are designed to have multiple steps. You could experiment with chaining different actions together. You could send notifications back to Xcode at multiple points in the sequence. You could even use the Tap trigger in Reality Composer, which is triggered when you tap on the Affected Object. If you notify Xcode when an item is tapped, you could store the Entity that gets passed back as a property of your DataModel class. From there, you could have a ‘tap to select’ function that lets your SwiftUI controls affect whatever object you tapped.
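As a sketch of that last idea, assume a Tap trigger paired with a Notify action whose generated accessor is boxTapped, plus a selectedEntity property on DataModel; both names are hypothetical:

```swift
// Hypothetical additions inside the DataModel class:
var selectedEntity: Entity?

// Call this once after loading the scene, e.g. at the end of init().
func setUpTapToSelect() {
    // 'boxTapped' is a hypothetical accessor generated from a Notify
    // action that follows a Tap trigger in Reality Composer.
    boxAnchor.actions.boxTapped.onAction = { entity in
        // Store the tapped Entity so SwiftUI controls can affect it later.
        self.selectedEntity = entity
    }
}
```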

About Rob Sturgeon

Rob is a Placement App Developer at Twinkl. He’s currently working on their iOS and Android apps, enabling teachers to view, organise and download resources on the go.

READ MORE:

How SwiftUI helps kids create their first iOS apps

Create your first AR app with RealityKit and SwiftUI
