iOS 16: How ARKit and RealityKit Help Measure Objects Accurately

Building an augmented reality (AR) measurement app

Shawn Sun
4 min read · Sep 14, 2023

Source code: https://github.com/shawnxdsun/arkit-measure.git

Since iOS 11, the ARKit framework has let developers build rich augmented reality (AR) experiences on iPhone and iPad devices with an Apple A9 or later processor. For example, you can build an app that measures the dimensions of physical objects.

This video shows an ARKit-based AR app measuring the distance between two points on a laptop keyboard within the camera's view.

Measuring distance using ARKit

However, you'll need to be careful. As you may have noticed, the measurement isn't accurate, because the two points that ARKit located and rendered float in space between the laptop's keyboard surface and the camera. We need a way to ensure the points land on the surface of the keyboard itself.

RealityKit provides high-performance 3D simulation. Combined with ARKit and the LiDAR scanner, it can quickly capture information about a wide area in front of the camera and convert it into meshes representing the surfaces of the physical environment. With these meshes, we can locate points on real-world surfaces much more accurately.
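Note that scene reconstruction requires a device with a LiDAR scanner, such as a recent iPad Pro or an iPhone Pro model. A minimal capability check, sketched here as a suggestion rather than code from the demo app:

guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) else {
    // This device has no LiDAR scanner, so scene meshes can't be generated.
    fatalError("Scene reconstruction requires a device with a LiDAR scanner")
}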

Measuring Objects Using ARKit and RealityKit

You can implement a measurement app using ARKit and RealityKit by following these steps.

1. Import the ARKit and RealityKit frameworks.

import ARKit
import RealityKit

2. Enable and retrieve scene meshes when the app starts.

override func viewDidLoad() {
    ...
    // Enable scene reconstruction so meshes are generated for real-world surfaces.
    let configuration = ARWorldTrackingConfiguration()
    configuration.sceneReconstruction = .meshWithClassification
    configuration.environmentTexturing = .automatic
    arView.session.run(configuration)
    ...
}

To begin the AR experience and enable scene meshes, configure and run the session in the main view controller's viewDidLoad callback. This way, mesh generation begins as soon as the app starts.
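If you want to see the reconstructed meshes on screen, like the red grid shown in the results below, RealityKit offers a scene-understanding debug visualization. This is one way to enable it, sketched as a suggestion rather than the demo app's exact rendering code:

// Draw the reconstructed scene meshes over the camera feed.
arView.debugOptions.insert(.showSceneUnderstanding)
// Optionally let real-world surfaces occlude virtual content behind them.
arView.environment.sceneUnderstanding.options.insert(.occlusion)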

3. Locate the points on a real-world surface.

if let result = arView.raycast(from: tapLocation, allowing: .estimatedPlane, alignment: .any).first {
    ...
}

Once the app starts, meshes are generated for the surfaces of the physical environment within the camera's view. When the user taps the screen, a ray is cast from the tapped position against those meshes, i.e., the surfaces of the objects. The built-in raycast method of ARView performs this query; its first result is the intersection point of the ray with a real-world surface.
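Putting this together, the tap location comes from a gesture recognizer attached to the AR view. The following sketch shows one way to wire it up; the handler name and setup are assumptions, not code from the demo project:

// In viewDidLoad, register a tap gesture on the AR view.
arView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(handleTap(_:))))

@objc func handleTap(_ recognizer: UITapGestureRecognizer) {
    let tapLocation = recognizer.location(in: arView)
    // Cast a ray from the tapped screen point against the scene meshes.
    if let result = arView.raycast(from: tapLocation, allowing: .estimatedPlane, alignment: .any).first {
        // result.worldTransform holds the intersection point on a real-world surface.
    }
}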

4. Visualize the intersection points.

// Anchor a small red sphere at the intersection point on the surface.
let pos = result.worldTransform
let resultAnchor = AnchorEntity(world: pos)
resultAnchor.addChild(sphere(radius: 0.01, color: .red))
arView.scene.addAnchor(resultAnchor)

Visualizing the intersection points lets the user see the real-world locations of the points they tapped on the screen and which distance is about to be measured.
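Note that sphere(radius:color:) is not a RealityKit API but a small helper that builds a sphere entity. A minimal sketch of what it might look like:

func sphere(radius: Float, color: UIColor) -> ModelEntity {
    // A simple sphere entity centered on the anchor's position.
    return ModelEntity(mesh: .generateSphere(radius: radius),
                       materials: [SimpleMaterial(color: color, isMetallic: false)])
}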

5. Calculate the distance between the two points.

// Convert the hit's world transform into a position and record it as a node.
let position = SCNVector3.positionFrom(matrix: result.worldTransform)
let sphere = SphereNode(position: position)
// If a previous point exists, measure from it to the new point.
if let lastNode = nodes.last {
    let distance = lastNode.position.distance(to: sphere.position)
    lengthLabel.text = String(format: "Distance: %.1f cm", distance * 100)
    lengthLabel.textColor = .red
}
nodes.append(sphere)

To calculate the distance, create a SphereNode from each located point and append it to an array of SphereNode values. Once two end nodes are available, you can compute the distance between them.
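SphereNode, positionFrom(matrix:), and distance(to:) are not framework APIs; they are small SceneKit-based helpers. Minimal sketches of possible implementations, assuming import SceneKit and that nodes is an array property on the view controller:

class SphereNode: SCNNode {
    init(position: SCNVector3) {
        super.init()
        geometry = SCNSphere(radius: 0.005)
        self.position = position
    }
    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}

extension SCNVector3 {
    // Extract the translation column of a 4x4 world transform.
    static func positionFrom(matrix: simd_float4x4) -> SCNVector3 {
        let column = matrix.columns.3
        return SCNVector3(column.x, column.y, column.z)
    }

    // Euclidean distance between two points, in meters.
    func distance(to vector: SCNVector3) -> Float {
        let dx = vector.x - x, dy = vector.y - y, dz = vector.z - z
        return (dx * dx + dy * dy + dz * dz).squareRoot()
    }
}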

Results and Conclusion

Here you can watch videos of 1) the demo app built with ARKit and RealityKit on iOS 16 and 2) an app based on ARKit alone as of iOS 11. We used the two apps to measure the diameter of the same round bench at the same time. The red grid in the left-hand video shows the meshes created and rendered by ARKit and RealityKit, which help position the points precisely on the surface of the bench. Comparing the videos shows that ARKit and RealityKit on iOS 16 measure objects more accurately.

Left: Measurement of 33.1 cm based on ARKit and RealityKit; Right: Measurement of 1.1 cm based on ARKit in iOS 11
