RealityKit 911 — Swift Playgrounds, or how to create and debug AR apps on iPad Pro

Andy Jazz · Geek Culture · Dec 15, 2021

A few words about coding on iPad

Swift Playgrounds 4.0 for iPad, Xcode's younger sibling, is an ideal app for starting to experiment with AR projects. Many developers who have tried it say it's a great option alongside another phenomenon, the Reality Composer app. What makes Swift Playgrounds so appealing? A few unobvious but positive reasons. Here are at least eight of them:

  • Swift Playgrounds is a free app with full AR functionality
  • You can simulate SwiftUI and UIKit projects right on iPad
  • Compilation time for AR apps is astonishing — almost instant
  • No iOS developer account is needed
  • No restrictions for free accounts, such as the 10-apps-per-7-days compilation limit
  • No Info.plist keys, such as NSCameraUsageDescription, are required
  • No internet connection is required to verify the app's signature
  • Connect integration lets you upload your finished app to the App Store

To tell the truth, though, there is one annoying drawback: Swift Playgrounds often crashes when I implement Gestures or People Occlusion features. I don't know how it behaves on an iPad with the M1 chip yet (I use an iPad Pro with an A12Z), but I hope this annoying bug will be fixed in future versions.

RealityKit implementation

For this app, I’ve created an anchor 500 meters away from the iPad camera. Remember, the far clipping plane in RealityKit 2.0 is still limited to 1000 meters — so make sure your 3D model doesn’t go outside the camera’s frustum.
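In .nonAR camera mode you can control the clipping distances yourself; a minimal sketch, assuming RealityKit's PerspectiveCamera API (in .ar mode the clipping planes are managed by the framework):

```swift
import RealityKit

let arView = ARView(frame: .zero)
arView.cameraMode = .nonAR

// A PerspectiveCamera entity exposes near/far clipping distances in .nonAR mode
let camera = PerspectiveCamera()
camera.camera.near = 0.1      // meters
camera.camera.far = 1000      // meters, the upper bound mentioned above

let cameraAnchor = AnchorEntity(world: .zero)
cameraAnchor.addChild(camera)
arView.scene.anchors.append(cameraAnchor)
```

With the anchor placed at -500 meters, the model stays well inside this frustum.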

Antananarivo

Your code for a similar AR app might look like this:

import SwiftUI
import RealityKit
import PlaygroundSupport

struct ContentView: View {
    var body: some View {
        ARContainer()
    }
}

struct ARContainer: UIViewRepresentable {

    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.cameraMode = .ar            // IMPORTANT

        let text: MeshResource = .generateText("Madagascar",
                                extrusionDepth: 15,
                                          font: .boldSystemFont(ofSize: 100),
                                containerFrame: .zero,
                                     alignment: .center,
                                 lineBreakMode: .byWordWrapping)

        let material = SimpleMaterial(color: .red, isMetallic: true)
        let entity = ModelEntity(mesh: text, materials: [material])
        entity.orientation = simd_quatf(angle: .pi / 8, axis: [0, 1, 0])

        // 500 meters North (-Z)
        let anchor = AnchorEntity(world: [0, 0, -500])
        anchor.addChild(entity)
        arView.scene.anchors.append(anchor)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) { }
}

PlaygroundPage.current.needsIndefiniteExecution = true
PlaygroundPage.current.setLiveView(ContentView())

Note that I used the setLiveView(_:) generic instance method here. It displays a view showing the result of running the code on the current page.

func setLiveView<IncomingView>(_ newLiveView: IncomingView) where 
IncomingView : PlaygroundLiveViewable

You can't launch a SwiftUI app in Swift Playgrounds through the liveView instance property, the way you did before for UIKit apps, because liveView must be an object that conforms to the PlaygroundLiveViewable protocol:

var liveView: PlaygroundLiveViewable? { get set }
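If you do want to use the liveView property with SwiftUI, one workaround is to wrap the view in a UIHostingController. This is a sketch, assuming PlaygroundSupport's conformance of UIViewController to PlaygroundLiveViewable:

```swift
import SwiftUI
import PlaygroundSupport

struct ContentView: View {
    var body: some View {
        Text("Hello, AR!")
    }
}

// UIHostingController is a UIViewController, which PlaygroundSupport
// makes PlaygroundLiveViewable, so this assignment compiles
PlaygroundPage.current.liveView = UIHostingController(rootView: ContentView())
```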

ARKit + SceneKit implementation

It's nice to know that we can still use the SceneKit module in UIKit and SwiftUI projects in Swift Playgrounds. Here's how to create a SceneKit scene based on ARSCNView (and even with an anchor).

import ARKit
import PlaygroundSupport

class ARController: UIViewController {

    let arView = ARSCNView(frame: .zero)
    var anchor: ARAnchor!

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view = arView             // IMPORTANT
        arView.scene = SCNScene()
        arView.autoenablesDefaultLighting = true

        // anchor is 2 meters away
        self.anchor = ARAnchor(name: "special",
                          transform: .init([1, 0, 0, 0],
                                           [0, 1, 0, 0],
                                           [0, 0, 1, 0],
                                           [0, 0, -2, 1]))

        let model = SCNNode(geometry: SCNBox(width: 0.1,
                                            height: 0.25,
                                            length: 0.1,
                                     chamferRadius: 0.02))
        model.geometry?.firstMaterial?.diffuse.contents = UIColor.red
        model.simdTransform = self.anchor.transform

        arView.session.add(anchor: self.anchor)
        arView.scene.rootNode.addChildNode(model)

        // ARKit config
        let config = ARWorldTrackingConfiguration()
        arView.session.run(config)
    }
}

PlaygroundPage.current.needsIndefiniteExecution = true
PlaygroundPage.current.liveView = ARController()
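Here the box node is attached to the scene manually. If instead you want SceneKit to place content only once ARKit has actually registered the anchor, the usual pattern is the ARSCNViewDelegate callback renderer(_:didAdd:for:). A sketch, assuming ARController sets arView.delegate = self in viewDidLoad():

```swift
import ARKit

extension ARController: ARSCNViewDelegate {

    // Called when ARKit adds a new ARAnchor to the session
    func renderer(_ renderer: SCNSceneRenderer,
                  didAdd node: SCNNode,
                  for anchor: ARAnchor) {
        guard anchor.name == "special" else { return }
        let box = SCNNode(geometry: SCNBox(width: 0.1,
                                          height: 0.25,
                                          length: 0.1,
                                   chamferRadius: 0.02))
        // The node ARKit provides already follows the anchor's transform
        node.addChildNode(box)
    }
}
```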

As you can see, I used the needsIndefiniteExecution instance property, a Boolean value that indicates whether indefinite execution is enabled.

var needsIndefiniteExecution: Bool { get set }

Definition from developer’s documentation:

By default, all top-level code is executed, and then execution is terminated. When working with asynchronous code, enable indefinite execution to allow execution to continue after the end of the playground’s top-level code is reached. This, in turn, gives threads and callbacks time to execute.

With a traditional live view, editing the playground automatically stops execution, even when indefinite execution is enabled. When using the always-on live view, execution continues whether or not indefinite execution is enabled.

Set needsIndefiniteExecution to true to continue execution after the end of top-level code. Set it to false to stop execution at that point. The default value is false. It's set to true when liveView is set to a non-nil value.
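When your asynchronous work is done, you can end the page explicitly with PlaygroundSupport's finishExecution(). A minimal sketch:

```swift
import Foundation
import PlaygroundSupport

PlaygroundPage.current.needsIndefiniteExecution = true

DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
    // Stop the page once the delayed work has completed
    PlaygroundPage.current.finishExecution()
}
```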

That’s all for now.

If this post is useful for you, please press the Clap button and hold it.

¡Hasta la vista!
