How to add CoreHaptics to your iOS app

Ash Slone
Bilue Product Design & Technology Blog
4 min read · Mar 7, 2022

What is CoreHaptics?

Core Haptics changes the game for haptics when compared with UIFeedbackGenerator. With the generator it wasn’t easy to time multiple haptics, or to pair haptics with sound. With Core Haptics you can describe a haptic in a JSON-like structure; the file type is actually AHAP, but the contents read like JSON. With this file structure you can describe a haptic pretty easily, and an AHAP file supports multiple haptics alongside sound as well. Each haptic event has multiple options to help configure it, but the main ones I’ll be talking about are time, intensity, and sharpness.

CoreHaptics has been available since iOS 13. Let’s take a look at how to implement it with a double tap that also plays a sound.

The repo for this article can be found here.

Why use Haptics?

Haptics engage people’s sense of touch to enhance the experience of interacting with onscreen interfaces. For example, the system plays haptics in addition to visual and auditory feedback to highlight the confirmation of an Apple Pay transaction. Haptics can also enhance touch gestures and interactions like scrolling through a picker or toggling a switch.

How does one initialise the engine and play a CoreHaptics file?

It’s best to initialise and start the haptic engine ahead of time; creating it only at the moment you need it can delay the playing of the haptic.

I’ll be showing this from a SwiftUI perspective.

Let’s create a State property for the CoreHaptics engine first. This State property is required here since the engine is mutated from within the view.

@State private var engine: CHHapticEngine?

Before attempting to initialise and start the engine, we need to check whether the hardware supports haptics.

guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else {
    return
}

If the check passes, we can attempt to initialise and start the engine.

do {
    engine = try CHHapticEngine()
    try engine?.start()
} catch {
    print("There was an error creating the engine: \(error.localizedDescription)")
}

So now we can combine this into a method to help us later.

@State private var engine: CHHapticEngine?

func prepareHaptics() {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else {
        return
    }

    do {
        engine = try CHHapticEngine()
        try engine?.start()
    } catch {
        print("There was an error creating the engine: \(error.localizedDescription)")
    }
}
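One more thing worth handling: the system can stop the engine when the app moves to the background or the audio session is interrupted, and the engine can reset itself after a media server failure. As a minimal sketch (this isn’t in the article’s repo), you could assign the engine’s resetHandler and stoppedHandler right after creating it in prepareHaptics:

engine?.resetHandler = {
    // The engine reset, so any existing players are invalid; try to restart it.
    do {
        try engine?.start()
    } catch {
        print("Failed to restart the engine: \(error.localizedDescription)")
    }
}

engine?.stoppedHandler = { reason in
    // The engine stopped for an external reason, such as an audio session interruption.
    print("The engine stopped: \(reason)")
}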

Let’s look at how to play the haptic

We will start off the same way we did when preparing haptics, by checking whether the hardware supports them.

guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else {
    return
}

Next we will grab the path for the haptic file itself.

guard let path = Bundle.main.path(forResource: "haptic", ofType: "ahap") else {
    return
}

Now let’s try to play this haptic.

do {
    // Start the engine in case it's idle.
    try engine?.start()

    // Tell the engine to play a pattern.
    try engine?.playPattern(from: URL(fileURLWithPath: path))
} catch {
    print("Failed to play pattern: \(error.localizedDescription).")
}

Again, we can combine this into a method to use later.

func playHaptic() {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics,
          let path = Bundle.main.path(forResource: "haptic", ofType: "ahap") else {
        return
    }

    do {
        // Start the engine in case it's idle.
        try engine?.start()

        // Tell the engine to play a pattern.
        try engine?.playPattern(from: URL(fileURLWithPath: path))
    } catch {
        print("Failed to play pattern: \(error.localizedDescription).")
    }
}
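The AHAP file isn’t the only way to get there: the same time, intensity, and sharpness values can also be built in code with CHHapticEvent and CHHapticPattern. Here’s a minimal sketch, assuming the same engine property as above; the 0.6 intensity and 1.0 sharpness mirror the AHAP example later in the article:

func playTransientHaptic() {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else {
        return
    }

    // Describe a single transient event: a brief tap 0.1 seconds into the pattern.
    let intensity = CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.6)
    let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: 1.0)
    let event = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [intensity, sharpness],
        relativeTime: 0.1
    )

    do {
        // Start the engine in case it's idle.
        try engine?.start()

        // Wrap the event in a pattern and play it through a pattern player.
        let pattern = try CHHapticPattern(events: [event], parameters: [])
        let player = try engine?.makePlayer(with: pattern)
        try player?.start(atTime: CHHapticTimeImmediate)
    } catch {
        print("Failed to play pattern: \(error.localizedDescription).")
    }
}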

How do we put all of this together?

Now that we know how to prepare and play a haptic, let’s combine everything in a view to test our work.

import CoreHaptics
import SwiftUI

struct ContentView: View {
    @State private var engine: CHHapticEngine?

    var body: some View {
        Button(
            "Play haptic",
            action: playHaptic
        )
        .onAppear(perform: prepareHaptics)
    }

    func prepareHaptics() {
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else {
            return
        }

        do {
            engine = try CHHapticEngine()
            try engine?.start()
        } catch {
            print("There was an error creating the engine: \(error.localizedDescription)")
        }
    }

    func playHaptic() {
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics,
              let path = Bundle.main.path(forResource: "haptic", ofType: "ahap") else {
            return
        }

        do {
            // Start the engine in case it's idle.
            try engine?.start()

            // Tell the engine to play a pattern.
            try engine?.playPattern(from: URL(fileURLWithPath: path))
        } catch {
            print("Failed to play pattern: \(error.localizedDescription).")
        }
    }
}
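One thing to keep in mind when testing: the simulator doesn’t support haptics, so capabilitiesForHardware().supportsHaptics returns false there and both methods return early. Run the app on a physical device to feel the result.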

What does a CoreHaptics file look like and what properties are available?

Haptic events can be represented in an AHAP file that resembles JSON.

We can represent multiple events in a single file. There are multiple EventType options; we will focus on just a few of them:

  • HapticTransient is a brief impulse occurring at a specific point in time, like the feedback from toggling a switch.
  • AudioCustom is an audio event using a waveform that you supply.

Each of these EventTypes can carry an EventParameters array, where every entry is a pair of ParameterID and ParameterValue.

There are a bunch of ParameterID options; again, we will focus on just a few:

  • HapticIntensity is the strength of the haptic event. The value is between 0 and 1.
  • HapticSharpness is the degree to which the haptic event feels tingly or persistent. The value is between 0 and 1.
  • AudioVolume is the volume or loudness of the sound. The value is between 0 and 1.

Note

A value of 0 doesn’t mean that the parameter has no effect, or that the absolute value of the parameter is zero. It’s a normalised value that maps to the minimum value the system supports. Likewise, a value of 1 doesn’t indicate 1 second; rather, it maps to the maximum supported value.

All events should also have a Time value: the floating-point start time of the event in seconds, relative to the beginning of the pattern.

HapticTransient

"Event":
{
"EventType": "HapticTransient",
"EventParameters":
[
{
"ParameterID": "HapticIntensity",
"ParameterValue": 0.6
},
{
"ParameterID": "HapticSharpness",
"ParameterValue": 1
}
],
"Time": 0.1
}

AudioCustom

"Event":
{
"Time": 0,
"EventType": "AudioCustom",
"EventWaveformPath": "audio.wav",
"EventParameters":
[
{
"ParameterID": "AudioVolume",
"ParameterValue": 0.75
}
]
}
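Putting the pieces together, here’s what a complete haptic.ahap could look like for the double tap with sound mentioned at the start of the article. The two transient events and their timings are illustrative, so adjust them to taste:

{
    "Version": 1.0,
    "Pattern":
    [
        {
            "Event":
            {
                "Time": 0,
                "EventType": "AudioCustom",
                "EventWaveformPath": "audio.wav",
                "EventParameters":
                [
                    {
                        "ParameterID": "AudioVolume",
                        "ParameterValue": 0.75
                    }
                ]
            }
        },
        {
            "Event":
            {
                "EventType": "HapticTransient",
                "EventParameters":
                [
                    {
                        "ParameterID": "HapticIntensity",
                        "ParameterValue": 0.6
                    },
                    {
                        "ParameterID": "HapticSharpness",
                        "ParameterValue": 1
                    }
                ],
                "Time": 0.1
            }
        },
        {
            "Event":
            {
                "EventType": "HapticTransient",
                "EventParameters":
                [
                    {
                        "ParameterID": "HapticIntensity",
                        "ParameterValue": 0.6
                    },
                    {
                        "ParameterID": "HapticSharpness",
                        "ParameterValue": 1
                    }
                ],
                "Time": 0.2
            }
        }
    ]
}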

More info on CoreHaptics

Apple Docs

Repo
