An Introduction to ARKit 2 — Object Scanning

Mohammed Ibrahim
Jun 30, 2018


The next part of the series is Object Scanning, a completely new feature that came with ARKit 2. The rest of the series is as follows:

  1. Image Tracking
  2. World Mapping
  3. Object Scanning
  4. Vision Integration

The Basics

Object Scanning is also a new feature in ARKit 2 (surprise!). It lets you scan and detect real-world objects and attach AR content to or around them.

In the first part, we introduced image tracking, which lets you dynamically detect images that you add to your app. That feature is limited to flat, two-dimensional images.

Object Scanning allows you to scan and detect objects in 3D.

Object scanning demo at WWDC

Scanning Real-World Objects

There are two ways to add an ARReferenceObject to your app:

  1. Add the scanning functionality to your app, or build a separate scanning app.
  2. Use Apple’s demo scanning app to quickly scan and export objects.

Let’s break it down.

1. Adding scanning functionality to your app.

Adding scanning functionality directly to your app is convenient because everything happens in one place: you don’t have to export your reference objects and re-add them to the app with every change.

Here is the basic flow:

  1. Initiate an ARObjectScanningConfiguration to enable the high-fidelity data collection that object scanning requires.
  2. After scanning an object in a session with that configuration, call createReferenceObject(transform:center:extent:completionHandler:) to extract the reference object: the transform defines the object’s local coordinate origin, center is the center of the bounding box relative to that origin, and extent is the width, height, and depth of the region to extract around that center.

These values are hard to eyeball while developing, so building a user interface that lets the user select the region to scan is a good approach.
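The flow above can be sketched roughly as follows. This is a minimal sketch, not production code: `sceneView` is assumed to be an ARSCNView already set up in your view controller, the transform, center, and extent values are placeholders for whatever region your UI lets the user select, and `exportURL` is a hypothetical file URL of your choosing.

```swift
import ARKit

// 1. Run a session with the high-fidelity scanning configuration.
let scanningConfiguration = ARObjectScanningConfiguration()
sceneView.session.run(scanningConfiguration)

// 2. Later, once the user has scanned the object and chosen a bounding box,
//    extract a reference object from the accumulated session data.
sceneView.session.createReferenceObject(
    transform: matrix_identity_float4x4,   // origin of the object's local coordinate system
    center: SIMD3<Float>(0, 0, 0),         // bounding-box center, relative to the transform
    extent: SIMD3<Float>(0.2, 0.2, 0.2)    // width, height, depth of the box to extract (meters)
) { referenceObject, error in
    if let referenceObject = referenceObject {
        // Persist the result, e.g. export it as a .arobject file.
        // `exportURL` is a placeholder for a destination of your choosing.
        try? referenceObject.export(to: exportURL, previewImage: nil)
    } else if let error = error {
        print("Scanning failed: \(error.localizedDescription)")
    }
}
```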

Refer to Apple’s demo app here to see how you can add the user interface to your app.

2. Using Apple’s demo app to quickly scan and export objects

If you don’t want or need to add scanning functionality to your own app, you can always use Apple’s demo scanning app to quickly scan and export objects.

When you export a scanned object, you can save it to Files (iCloud Drive) or send it to your Mac using AirDrop.

Download the demo Xcode project here.

Detecting Real-World Objects

Once you’ve got .arobject files, either from the Apple demo app or from your own app’s scanning capabilities, you can use them to then detect the objects and interact with them.

Adding reference objects to your project

  1. Add an AR resource group to your project’s asset catalog.
  2. Drag the .arobject files from the Finder into the AR resource group.
  3. Optionally, give each reference object an identifier name so you can tell them apart when they are detected.

Loading reference objects

Load the reference objects you want to detect as ARReferenceObject instances, and then assign them to the detectionObjects property of an ARWorldTrackingConfiguration.

Here’s an example of setting up the configuration and its detection objects:

let configuration = ARWorldTrackingConfiguration()
guard let referenceObjects = ARReferenceObject.referenceObjects(inGroupNamed: "gallery", bundle: nil) else {
    fatalError("Missing expected asset catalog resources.")
}
configuration.detectionObjects = referenceObjects
sceneView.session.run(configuration)

Detecting the objects

Once you’ve added the reference objects to your AR configuration, the session automatically detects them when they appear in view and adds an ARObjectAnchor for each to the session’s list of anchors.

You can then respond by implementing one of the delegate methods, such as renderer(_:didAdd:for:). Here is an example:

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    if anchor is ARObjectAnchor {
        // `model` is assumed to be an SCNNode you've loaded elsewhere.
        node.addChildNode(self.model)
    }
}

ARKit calls this method once whenever a node is added for a new anchor. If that anchor is an object anchor, you can attach content to its node, for example placing a model above the detected object so your content reacts to it.
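If you’ve scanned several objects, the anchor’s referenceObject tells you which one was detected, and its extent lets you position content relative to the object’s size. A rough sketch, assuming `makeLabelNode(text:)` is a hypothetical helper of yours that builds a text node:

```swift
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let objectAnchor = anchor as? ARObjectAnchor else { return }
    let referenceObject = objectAnchor.referenceObject

    // Place a label slightly above the top of the detected object's bounding box.
    // extent.y is the object's height in meters.
    let label = makeLabelNode(text: referenceObject.name ?? "Unknown object")
    label.position = SCNVector3(0, referenceObject.extent.y + 0.05, 0)
    node.addChildNode(label)
}
```

Because the node is anchored to the object, the label stays in place relative to it as the user moves around.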

That’s it! Object scanning and detection has plenty of potential applications — it could be used as a way to interact and learn more about art pieces such as statues at a museum, for example. The possibilities are endless!

Have fun with it! Feedback would be greatly appreciated. Thank you!

More where this came from

This story is published in Noteworthy, where thousands come every day to learn about the people & ideas shaping the products we love.

Follow our publication to see more product & design stories featured by the Journal team.


Mohammed Ibrahim

WWDC 18 Scholar | CoherentHub | iOS Developer | UI Designer