RealityKit 911 — How to enable both Vertical and Horizontal plane detection

Andy Jazz · Published in Mac O’Clock · Jul 28, 2020

When prototyping in Reality Composer, it is obvious that we can’t activate vertical and horizontal plane detection simultaneously, because the menu offers radio buttons, not a set of options. But many developers aren’t sure how to do it even in RealityKit. My story will shed light on this topic.

Radio buttons allowing you to enable a World, Image, Face Tracking or Object Scanning configuration

In ARKit, however, there is a straightforward approach — we have to use the planeDetection instance property, whose type conforms to the OptionSet protocol. This property has two options, vertical and horizontal:

let config = ARWorldTrackingConfiguration()
config.planeDetection = [.vertical, .horizontal]
sceneView.session.run(config)

If you want to know what type of alignment your ARPlaneAnchor has, you can check it with a simple if-else-if statement inside ARSCNViewDelegate’s renderer(_:didAdd:for:) method.

func renderer(_ renderer: SCNSceneRenderer,
              didAdd node: SCNNode,
              for anchor: ARAnchor) {

    guard let planeAnchor = anchor as? ARPlaneAnchor
    else { return }

    if planeAnchor.alignment == .horizontal {
        print("Horizontal")
    } else if planeAnchor.alignment == .vertical {
        print("Vertical")
    }
}
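By the way, if you’re working with RealityKit’s ARView rather than SceneKit’s ARSCNView, there is no SCNSceneRenderer delegate, but you can observe the same ARPlaneAnchor additions through ARSessionDelegate. A minimal sketch, assuming you’ve set arView.session.delegate = self beforehand:

extension ViewController: ARSessionDelegate {
    // Called whenever ARKit adds new anchors to the running session
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let planeAnchor as ARPlaneAnchor in anchors {
            print(planeAnchor.alignment == .horizontal ? "Horizontal" : "Vertical")
        }
    }
}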

RealityKit

RealityKit has an AnchorEntity class with a convenience initializer that targets planes — init(plane:classification:minimumBounds:). There are three alignment options available for the plane-detection case: vertical, horizontal or any.

let planeAnchor = AnchorEntity(.plane([.vertical, .horizontal],
                                      classification: [.wall, .floor, .ceiling],
                                      minimumBounds: [1.0, 1.0]))

Keep in mind that this initializer doesn’t work in the Simulator, so make sure a physical device is chosen in Xcode’s active scheme.
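If you still need the project to build and run in the Simulator (for UI work, for example), you can fence the anchoring code with a compilation condition. A minimal sketch using Swift’s targetEnvironment condition:

#if targetEnvironment(simulator)
    // Plane detection is unavailable in the Simulator
    print("Run on a physical device to see plane detection.")
#else
    let planeAnchor = AnchorEntity(.plane([.vertical, .horizontal],
                                          classification: [.wall, .floor, .ceiling],
                                          minimumBounds: [1.0, 1.0]))
#endif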

Let’s programmatically create a box primitive in the ViewController.swift file and pin it to our planeAnchor object:

import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet weak var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        let boxMesh: MeshResource = .generateBox(size: 0.25)
        let modelEntity = ModelEntity(mesh: boxMesh)

        let planeAnchor = AnchorEntity(.plane([.any],
                                              classification: [.any],
                                              minimumBounds: [0.5, 0.5]))
        planeAnchor.addChild(modelEntity)
        arView.scene.anchors.append(planeAnchor)
    }
}

A schematic representation of a scene and its anchors collection, tethering models, lights, tracked bodies, environmental audio and cameras, is quite simple.
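In code, that collection is nothing more than several AnchorEntity objects appended to the same scene. A minimal sketch, reusing the arView outlet from the previous snippet (the light and its position are arbitrary):

// A plane anchor tethering a model
let modelAnchor = AnchorEntity(.plane([.any],
                                      classification: [.any],
                                      minimumBounds: [0.5, 0.5]))
modelAnchor.addChild(ModelEntity(mesh: .generateBox(size: 0.25)))

// A world anchor tethering a light, pinned 2 m above the session origin
let lightAnchor = AnchorEntity(world: [0, 2, 0])
lightAnchor.addChild(PointLight())

arView.scene.anchors.append(modelAnchor)
arView.scene.anchors.append(lightAnchor)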

Note, however, that in ARKit the anchors collection is reachable via the ARSession object, not via the Scene object.
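A quick side-by-side sketch of the two worlds (ARAnchor and matrix_identity_float4x4 require import ARKit; the anchor’s name here is hypothetical):

// RealityKit — anchors belong to the scene
arView.scene.anchors.append(AnchorEntity(world: .zero))

// ARKit — anchors belong to the session
arView.session.add(anchor: ARAnchor(name: "myARAnchor",
                                    transform: matrix_identity_float4x4))
let sessionAnchors = arView.session.currentFrame?.anchors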

Reality Composer + RealityKit

Let’s assume that we’ve created a simple scene containing a clock model in Reality Composer. For this project we chose a World Tracking configuration with vertical plane detection.

Reality Composer’s “Interior Items” scene with a model named “Clock”

As we have only a vertical alignment here, let’s add a horizontal alignment as well. First we need to load this scene in Xcode. For that we have to use Swift’s try! operator, because we’re loading the scene with a throwing function.

let sceneAnchor = try! Experience.loadInteriorItems()

Now all we have to do is get to the anchoring component in the scene hierarchy and assign both vertical and horizontal alignment options.

Let’s take a look at the scene’s hierarchy to find out where the anchor object with its corresponding anchoring component is located. But before that, we should give descriptive names to our scene and our anchor. It’s quite easy:

sceneAnchor.name = "myScene"
sceneAnchor.children[0].name = "myAnchor"
sceneAnchor.children[0].children[0].name = "myEntity"

Then just print the scene anchor:

print(sceneAnchor)

Classical Parent-Child hierarchical dependency of Reality Composer’s scene

According to the hierarchy, the path to the anchoring component is now obvious:

sceneAnchor.children[0].anchor?.anchoring

Paste it into our code:

class ViewController: UIViewController {

    @IBOutlet weak var arView: ARView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        let sceneAnchor = try! Experience.loadInteriorItems()

        sceneAnchor.children[0].anchor?.anchoring =
            AnchoringComponent(.plane([.horizontal, .vertical],
                                      classification: .any,
                                      minimumBounds: [0.2, 0.2]))

        arView.scene.anchors.append(sceneAnchor)
    }
}

Anyway, for many of us a better approach here is to retrieve the model entity from the scene hierarchy and re-attach it to a new anchor of any desired type. To do that, we have two options. The first one is as follows:

let sceneAnchor = try! Experience.loadInteriorItems()
let clockEntity = sceneAnchor.children[0].children[0].children[0]

And if a Reality Composer model has a name (and, yes, ours does have a name, remember?), Reality Composer automatically generates a property with the same name (it’s an optional Entity, hence the force-unwrap below). So the second option is much more convenient, isn’t it?

let clockEntity = sceneAnchor.clock!

The full version of our code might look like this:

class ViewController: UIViewController {

    @IBOutlet weak var arView: ARView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        let sceneAnchor = try! Experience.loadInteriorItems()
        let clockEntity = sceneAnchor.clock!

        let anchor = AnchorEntity(plane: [.horizontal, .vertical],
                                  classification: .any,
                                  minimumBounds: [0.2, 0.2])
        anchor.addChild(clockEntity)
        arView.scene.anchors.append(anchor)
    }
}

Voila!

That’s all for now.

If this post is useful for you, please press the Clap button and hold it.

¡Hasta la vista!
