ARKit 911 — Image Tracking in Swift Playgrounds

Andy Jazz · Published in Geek Culture · Jan 7, 2022
In compositing apps like NUKE, a one-point tracker lets you record the XY position of an image, a two-point tracker adds Z rotation, a three-point tracker adds scale, and a four-point tracker gives you corner pinning. Most AR frameworks use these transform cases along with machine learning algorithms.

The purpose of this story

For some unexplained reason, Swift Playgrounds does not allow us to use the Resources folder by its name, even though the print command happily informs us: “Everything’s OK, the sought image is inside the Resources folder”.

Screenshot of Swift Playgrounds 4.0 for iPad
print(Bundle.main.path(forResource: "Photo", ofType: "png")!)

The printed result looks like this:

/var/mobile/Containers/Data/PluginKitPlugin/
A0E15417-9B30-45D7-AB2D-9B8C03EE97FA/Library/
Application Support/Main Module Resources Bundles/
8EE548B0-9675-438F-9571-242D17EA12F5.bundle/
Contents/Resources/Photo.png
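By the way, since the file is demonstrably there, here is a small workaround sketch of mine (an assumption, not part of the original trick): load the bitmap by its resolved path rather than by group name.

// Sketch: resolve the file's full path, then build the image from it.
// The cgImage obtained here can feed ARReferenceImage's initializer below.
if let path = Bundle.main.path(forResource: "Photo", ofType: "png"),
   let uiImage = UIImage(contentsOfFile: path),
   let cgImage = uiImage.cgImage {
    print(cgImage.width, cgImage.height)    // proof the bitmap loaded
}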

In this story, then, I want to show you a trick for getting around this. Before we move further, load all the reference images into the Resources folder.

Xcode approach

Many developers are used to the referenceImages(inGroupNamed:bundle:) type method, which loads all the reference images from the specified Resources group in your Xcode project’s asset catalog:

class func referenceImages(inGroupNamed name: String,
                           bundle: Bundle?) -> Set<ARReferenceImage>?

It returns an optional Set<ARReferenceImage> that can contain up to one hundred unique reference images.

Resources folder in Assets.xcassets

Here I’ve used ARImageTrackingConfiguration:

func imageTrackingConfig() {

    guard let resources = ARReferenceImage.referenceImages(
                          inGroupNamed: "Resources",
                                bundle: nil)
    else { return }

    let config = ARImageTrackingConfiguration()
    config.trackingImages = resources
    self.sceneView.session.run(config)
}
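As a side note (this variant is my own addition, not part of the original story): the same set of images can also drive image detection inside a world-tracking session, via the detectionImages property.

// Sketch: image detection in a world-tracking session (assumed variant).
func worldTrackingConfig() {

    guard let resources = ARReferenceImage.referenceImages(
                          inGroupNamed: "Resources",
                                bundle: nil)
    else { return }

    let config = ARWorldTrackingConfiguration()
    config.detectionImages = resources
    config.maximumNumberOfTrackedImages = 1    // track one image at a time
    self.sceneView.session.run(config)
}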

However, if you try to use this method in Swift Playgrounds, you will fail: the inGroupNamed argument requires the name of a resources group, and, as shown above, Swift Playgrounds gives you no way to reference one.

Swift Playgrounds approach

Instead, you have to use the ARReferenceImage initializer init(_:orientation:physicalWidth:). It creates a new reference image from a Core Graphics image object; the physicalWidth parameter is the image’s real-world width, in meters.

public init(_ image: CGImage,
            orientation: CGImagePropertyOrientation,
            physicalWidth: CGFloat)

Paste my method into your ViewController class:

fileprivate func imageTrackingConfig() {

    let imageFromWeb = #imageLiteral(resourceName: "Photo.png")

    let image = ARReferenceImage(imageFromWeb.cgImage!,
                                 orientation: .up,
                                 physicalWidth: 0.5)
    self.trackingImages.insert(image)

    let config = ARImageTrackingConfiguration()
    config.trackingImages = trackingImages
    self.sceneView.session.run(config)
}

The rest of the code looks familiar to you, doesn’t it?

import ARKit
import SceneKit
import PlaygroundSupport

class ViewController: UIViewController {

    let sceneView = ARSCNView(frame: .zero)
    var trackingImages = Set<ARReferenceImage>()

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view = self.sceneView

        sceneView.delegate = self
        sceneView.scene = SCNScene()
        sceneView.autoenablesDefaultLighting = true

        self.imageTrackingConfig()
    }
}

PlaygroundPage.current.needsIndefiniteExecution = true
PlaygroundPage.current.liveView = ViewController()

And then, at last, the extension:

extension ViewController: ARSCNViewDelegate {

    func renderer(_ renderer: SCNSceneRenderer,
                    didAdd node: SCNNode,
                       for anchor: ARAnchor) {

        if let _ = anchor as? ARImageAnchor {

            let geo = SCNTorus(ringRadius: 0.1, pipeRadius: 0.05)
            let donut = SCNNode(geometry: geo)
            donut.geometry?.firstMaterial?.lightingModel = .phong
            donut.geometry?.firstMaterial?.diffuse.contents =
                                          UIImage(named: "Photo 1.png")
            node.addChildNode(donut)
        }
    }
}
“Photo 1.png” is the donut’s texture

This should definitely work.

Donut model tethered to an ARImageAnchor
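If you would rather have the overlay match the detected picture exactly, here is a minimal sketch (my variation on the delegate method above, not part of the original post) that reads the anchor’s referenceImage.physicalSize and builds a plane of that size:

// Sketch for the same renderer(_:didAdd:for:) callback:
// size an SCNPlane to the detected image's physical dimensions.
if let imageAnchor = anchor as? ARImageAnchor {

    let size = imageAnchor.referenceImage.physicalSize
    let plane = SCNPlane(width: size.width, height: size.height)
    plane.firstMaterial?.diffuse.contents =
                         UIColor.white.withAlphaComponent(0.75)

    let planeNode = SCNNode(geometry: plane)
    planeNode.eulerAngles.x = -.pi / 2    // lay the plane flat over the image
    node.addChildNode(planeNode)
}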

And finally, a few lines for those who want to implement image tracking in RealityKit. The code looks a little different from the first case; the main difference is the use of the session(_:didAdd:) instance method.

import ARKit
import RealityKit
import PlaygroundSupport

class ViewController: UIViewController {

    let arView = ARView(frame: .zero)
    var trackingImages = Set<ARReferenceImage>()

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view = self.arView
        self.arView.cameraMode = .ar

        self.arView.session.delegate = self

        self.imageTrackingConfig()
    }
}
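The imageTrackingConfig() method is not repeated here; presumably it is the same method as before, with arView.session in place of sceneView.session, along these lines:

// Assumed counterpart of the earlier method, adapted for ARView.
fileprivate func imageTrackingConfig() {

    let imageFromWeb = #imageLiteral(resourceName: "Photo.png")

    let image = ARReferenceImage(imageFromWeb.cgImage!,
                                 orientation: .up,
                                 physicalWidth: 0.5)
    self.trackingImages.insert(image)

    let config = ARImageTrackingConfiguration()
    config.trackingImages = trackingImages
    self.arView.session.run(config)
}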

You can see that the AnchorEntity class has an init(anchor: ARAnchor) convenience initializer, which helps us achieve the desired result.

extension ViewController: ARSessionDelegate {

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {

        for imageAnchor in anchors.compactMap({ $0 as? ARImageAnchor }) {

            let entity = ModelEntity(mesh: .generateBox(size: 0.1))
            let anchor = AnchorEntity(anchor: imageAnchor)
            anchor.addChild(entity)
            self.arView.scene.anchors.append(anchor)
        }
    }
}

For those using SwiftUI, coordinators need to be implemented: SwiftUI coordinators are designed to act as delegates for UIKit views and view controllers. You can see how to do it here, and in the sketch below.
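This sketch is mine, for illustration only (the ARViewContainer name and its contents are assumptions, not code from the original post):

import SwiftUI
import ARKit
import RealityKit

// Sketch: a SwiftUI wrapper whose Coordinator plays the ARSessionDelegate role.
struct ARViewContainer: UIViewRepresentable {

    func makeCoordinator() -> Coordinator { Coordinator() }

    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.session.delegate = context.coordinator
        // ...build the Set<ARReferenceImage> and run the configuration here...
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) { }

    class Coordinator: NSObject, ARSessionDelegate {
        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            // React to new ARImageAnchors, as in the UIKit version above.
        }
    }
}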

That’s all for now.

If this post is useful for you, please press the Clap button and hold it.

¡Hasta la vista!
