The art of providing haptic touch feedback — XR accessibility Part 2

Raju K
Published in XRPractices · 6 min read · Dec 26, 2022
Image Copyright: Noah Kalina (Flickr)

Haptic feedback, or haptics, is a technology that uses the sense of touch to communicate information to users. Traditionally, haptic feedback on mobile devices was produced by a simple vibration motor, and these vibrations often did not feel intuitive or natural. To address this, Apple introduced the Taptic Engine in its iPhones, a linear actuator that provides haptic feedback that is more precise and natural than a traditional vibration motor. The Taptic Engine allows developers to create a wide range of haptic sensations, from subtle taps and vibrations to more robust and pronounced impacts, in order to give users a sense of touch and interactivity. This technology has greatly improved the usability and accessibility of haptic feedback on mobile devices.

While the Taptic Engine has been around for some time, it is being used increasingly in iOS applications to provide haptic feedback for user interface (UI) interactions. The same technology can also be leveraged to make mobile augmented reality (AR) experiences more accessible for vision-impaired users.
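As a quick refresher, here is a minimal sketch of how an iOS app can drive the Taptic Engine through UIKit's UIImpactFeedbackGenerator, the same API we will use later when a collision is detected. The helper name playHapticTap is just for illustration.

import UIKit

// Minimal sketch: trigger a single haptic "tap" from any UIKit context,
// for example inside a button action handler.
func playHapticTap() {
    let generator = UIImpactFeedbackGenerator(style: .medium)
    generator.prepare()        // warms up the Taptic Engine to reduce latency
    generator.impactOccurred() // fires the haptic impact
}

Other styles such as .light and .heavy (and .soft / .rigid on iOS 13+) produce weaker or stronger sensations.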

One way to utilize haptics for accessibility in mobile AR is to provide feedback when the user interacts with virtual objects. To do this, the system must be able to detect when the physical device (the smartphone itself) collides with a virtual object, which in turn requires reliably tracking the position of the device within the virtual world.

In iOS, the ARKit framework provides a solution to this problem through ARWorldTrackingConfiguration. When starting an ARSession, developers can set the configuration's worldAlignment to .gravityAndHeading, which aligns the AR world's coordinate system with gravity and compass heading so the device's position and orientation can be tracked consistently within the virtual world.

var defaultConfiguration: ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = .horizontal
    // The line below is super important
    configuration.worldAlignment = .gravityAndHeading
    configuration.isAutoFocusEnabled = true
    configuration.environmentTexturing = .automatic
    return configuration
}

In order to provide haptic feedback when a user interacts with virtual objects in an AR experience, it is necessary to add a virtual object to the ARKit scene view and assign it an SCNPhysicsBody. The SCNPhysicsBody is the object that defines the physical characteristics of a 3D object in a SceneKit scene, such as its mass, shape, and collision behavior.

In the code snippet below, a class called PhoneNode represents our phone as a virtual object in the ARKit scene. The PhoneNode class initializes an SCNBox with specific dimensions (in my case, an iPhone 12 mini), which serves as the geometry for the virtual object. It also creates an SCNPhysicsShape from this geometry and assigns it to the virtual object's SCNPhysicsBody. The physics body is given the .dynamic type, indicating that it will be moving within the scene, and its category and contact test bit masks are set to specific values. IMPORTANT: without these two bit mask parameters, the physics simulation will not report contacts for this virtual object. categoryBitMask acts as the tag of this physics body, and contactTestBitMask specifies which other categories this body is allowed to interact or collide with.

The phone's collider node is hidden from view so that it does not obstruct the user's view of the scene.

import SceneKit

class PhoneNode {
    let node: SCNNode
    private let boxCollider: SCNBox
    private let physicsShape: SCNPhysicsShape

    init() {
        // Rough iPhone 12 mini dimensions in portrait mode, with some buffer
        self.boxCollider = SCNBox(width: 0.15, height: 0.075, length: 0.01, chamferRadius: 0)
        self.physicsShape = SCNPhysicsShape(geometry: self.boxCollider, options: nil)
        self.node = SCNNode(geometry: self.boxCollider)
        // .dynamic as the phone will be moving in the scene
        self.node.physicsBody = SCNPhysicsBody(type: .dynamic, shape: self.physicsShape)
        self.node.physicsBody?.categoryBitMask = ObjectType.PhoneType.rawValue
        self.node.physicsBody?.contactTestBitMask = ObjectType.VirtualObjectNodeType.rawValue
        // Hide the collider so it does not obstruct the camera view
        self.node.isHidden = true
        self.node.name = "phonenode"
    }

    func updateLocation(location: simd_float4x4) {
        self.node.simdTransform = location
    }
}

In the code snippet below, a class called VirtualObjectNode represents a virtual object in the ARKit scene. The SCNPhysicsBody of this node is given the .static type, indicating that it will not be moving within the scene, and its category bit mask is set accordingly. The virtual object is given a name and added to the ARSCNView's scene as a child node.

import ARKit

class VirtualObjectNode {
    private var node: SCNNode
    private let physicsShape: SCNPhysicsShape

    init(position: SCNVector3) {
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
        let material = SCNMaterial()
        material.diffuse.contents = UIColor.red
        material.specular.contents = UIColor.white
        material.metalness.intensity = 1
        material.shininess = 50
        material.lightingModel = .physicallyBased
        box.materials = [material]

        node = SCNNode(geometry: box)
        node.position = position

        self.physicsShape = SCNPhysicsShape(geometry: box, options: nil)
        // .static as the virtual object stays where it was placed
        self.node.physicsBody = SCNPhysicsBody(type: .static, shape: self.physicsShape)
        self.node.physicsBody?.categoryBitMask = ObjectType.VirtualObjectNodeType.rawValue
        self.node.name = "virtualobject"
    }

    func addToSceneView(_ sceneView: ARSCNView) {
        sceneView.scene.rootNode.addChildNode(self.node)
    }
}

The ObjectType enum used for the bit mask values is shown below. These constants specify the categories for the different types of objects that may be present in the ARKit scene and that can interact or collide with each other during the physics simulation. In this case, the PhoneType and VirtualObjectNodeType constants define the two categories we need.

import Foundation

enum ObjectType: Int {
    case PhoneType = 1
    case VirtualObjectNodeType = 2
}
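If more object categories are added later, keeping the raw values as distinct powers of two lets a single body test for contacts against several categories at once by OR-ing the masks together. A hypothetical sketch (the ObstacleType case does not exist in this project):

// Hypothetical additional category with a power-of-two raw value:
// enum ObjectType: Int {
//     case PhoneType = 1
//     case VirtualObjectNodeType = 2
//     case ObstacleType = 4
// }

// The phone body could then report contacts with both categories
phoneNode.node.physicsBody?.contactTestBitMask =
    ObjectType.VirtualObjectNodeType.rawValue | ObjectType.ObstacleType.rawValue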

In the viewDidLoad method, we create a PhoneNode object and add it to the scene as a child node of the root node of the ARSCNView. This makes the PhoneNode available to the physics simulation. We also add a virtual object two meters away from the camera.

override func viewDidLoad() {
    super.viewDidLoad()
    self.phoneNode = PhoneNode()
    sceneView.scene.rootNode.addChildNode(phoneNode.node)

    // Place the virtual object 2 meters in front of the user's starting position
    let virtualObject = VirtualObjectNode(position: SCNVector3(0, 0, -2))
    virtualObject.addToSceneView(sceneView)
}

In the viewDidAppear method, we start the ARKit session using the defaultConfiguration property defined at the beginning of this article. That configuration sets the world alignment of the ARSession to .gravityAndHeading, which enables the system to track the position and orientation of the device in the virtual world based on its movement and orientation.

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)

    guard ARWorldTrackingConfiguration.isSupported else {
        fatalError("ARKit is not available on this device.")
    }
    sceneView.session.delegate = self
    sceneView.session.run(defaultConfiguration)

    // Keep the delegate in a stored property so it stays alive;
    // SceneKit does not retain its contactDelegate
    self.physicsTriggers = PhysicsTriggers()
    sceneView.scene.physicsWorld.contactDelegate = physicsTriggers
}

Next, we update the position of the PhoneNode on every frame update of the AR session so that the phone's position in the virtual world stays in sync with the real device.

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    if phoneNode != nil {
        // The camera transform is the device's position and orientation in world space
        phoneNode.updateLocation(location: frame.camera.transform)
    }
}

You will also notice that the viewDidAppear method sets the contact delegate of the ARSCNView's physics world to an instance of the PhysicsTriggers class. The contact delegate is an object that conforms to the SCNPhysicsContactDelegate protocol and is responsible for handling contact events between SCNPhysicsBody objects in an AR session.

import SceneKit
import UIKit

class PhysicsTriggers: NSObject, SCNPhysicsContactDelegate {
    private var feedbackGenerator: UIImpactFeedbackGenerator

    override init() {
        feedbackGenerator = UIImpactFeedbackGenerator(style: .medium)
        super.init()
        // Prepare the Taptic Engine so the first impact plays with minimal latency
        feedbackGenerator.prepare()
    }

    func physicsWorld(_ world: SCNPhysicsWorld, didBegin contact: SCNPhysicsContact) {
        if let nodeA = contact.nodeA.name, let nodeB = contact.nodeB.name {
            if (nodeA == "phonenode" && nodeB == "virtualobject") || (nodeA == "virtualobject" && nodeB == "phonenode") {
                feedbackGenerator.impactOccurred()
            }
        }
    }
}

The PhysicsTriggers class conforms to the SCNPhysicsContactDelegate protocol. It is responsible for handling contact events between objects participating in the physics simulation, since we set it as the contactDelegate earlier in the viewDidAppear method. The instance variable feedbackGenerator, of type UIImpactFeedbackGenerator, is initialized with the .medium style. UIImpactFeedbackGenerator is a haptic feedback generator provided by iOS that allows developers to create haptic sensations simulating physical impacts or collisions. When the generator's impactOccurred() method is called, the .medium style produces a sensation like a gentle tap on the shoulder, reminding the user to pay attention.

When the user waves the phone through the area where we placed the virtual object, the physicsWorld(_:didBegin:) method of the SCNPhysicsContactDelegate protocol is called. In this method, the names of the two objects involved in the contact are checked and, if one of them is the PhoneNode and the other is the VirtualObjectNode, the impactOccurred() method of the feedbackGenerator is called to produce a haptic sensation. Congrats! Your project is now all set to generate haptic feedback when the phone interacts with a virtual object.
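As a possible refinement (not part of the code above), the collision information carried by SCNPhysicsContact could be used to vary the feedback intensity, for example by switching to a heavier style when the phone pushes deeper into the virtual object. A minimal sketch, where the 0.03 m threshold is an arbitrary illustrative value:

// Sketch: pick the haptic style based on how deep the contact is
func physicsWorld(_ world: SCNPhysicsWorld, didBegin contact: SCNPhysicsContact) {
    let style: UIImpactFeedbackGenerator.FeedbackStyle =
        contact.penetrationDistance > 0.03 ? .heavy : .light
    let generator = UIImpactFeedbackGenerator(style: style)
    generator.impactOccurred()
}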

< Part 1 — Understanding Spatial Audio
