Core Haptics in Depth

February 19, 2020

By Gernot Poetsch

In the first part of this article, we covered the basics of Core Haptics, the Apple framework for custom haptic events on the iPhone. In this article, we will dig deeper and discuss creating custom haptic content, exporting it as an AHAP file (Apple Haptic and Audio Pattern), and integrating that content into an application.

An AHAP file is the most convenient way to specify complex haptic patterns — in this form, haptic events and sounds are specified together in the same place. Getting started with the AHAP file format is easy: treat it like a normal file, as you would with an MP3, and “play” it.

There are currently two ways to generate an AHAP file: write the code manually, or use one of several visual tools.

Creating AHAP content

It is possible to write a complete AHAP file in your favorite text or JSON editor. However, this method is hardly comfortable; we recommend a visual tool instead, which provides a better way to create and preview the haptic effect and then generates the AHAP file. Here is an overview of three currently available AHAP creation tools:

Captain AHAP

This is the first AHAP editor available to the public: a web-based application with a companion app on GitHub, very fittingly called Moby. Captain AHAP and Moby are free, and Moby is open source(1).

Using Captain AHAP, you can place events, parameters and parameter curves on a visual timeline and generate an AHAP file. It’s a great tool to experiment with if you want to place and modify events manually and have a clear understanding of the outcome.

(1) The great folks at Fancy Pixel, who made Captain AHAP, are also part of the team that built Lofelt Composer. Thanks!


This is the youngest of the three tools and similar to Captain AHAP in functionality. It also lets you place events, parameters, and parameter curves, but as a native Mac app with a companion iOS app for playback. Both are available in the App Store, and the Mac app costs around $10. While it requires deep knowledge of the building blocks of an AHAP file, it provides precise numeric control over all aspects of the pattern and does a good job of visualizing all events and parameters.

Lofelt Composer

Lofelt Composer is a web-based tool with an iOS companion app, and it is the impetus for writing this article. It takes a different approach than the other two options: it doesn't require the user to place events manually. Instead, the user can simply drop an audio file onto Composer, which automatically generates a complex series of events and parameter curves. As can be seen in the screenshot above, these events and parameter curves are presented to the user as a single haptic envelope that can be fine-tuned as needed. This audio-driven approach is very useful for games and any other application where you already have an audio component, or if you are trying to make the haptics mimic real-world examples for which you have created an audio file.

All three tools pair an editor with a companion app because, when experimenting on and composing a pattern, a fast trial-and-error feedback loop is essential. Lofelt Composer makes setting this up very easy: open the tool in the browser, launch the companion app on your iPhone, and scan the displayed QR code. For the rest of the session, your phone is in sync with the browser, and you can immediately feel the pattern changes you make in real time. Once you are happy with the result, copy the AHAP file to the clipboard and integrate it into your app.


Having created your AHAP file content using one of these tools, you can easily copy the file into your existing workflow and integrate it into your app.

How to Play an AHAP file

A simple AHAP player can be written in Swift with very little code:

let patternURL = ... // URL to the AHAP file, preferably a local file
let engine = try CHHapticEngine()
try engine.start()
try engine.playPattern(from: patternURL)

That is the easiest way to play a pattern, sufficient for experimenting. In a production project, you should check whether the haptic engine is available and do proper error handling. The engine also has handler blocks for stopping, resetting, and finishing, which a production app should provide.
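As a sketch of what such a production setup might look like (the wrapper type and its names are our own, not part of the framework):

```swift
import CoreHaptics

/// A minimal, hedged sketch of a production-style AHAP player.
final class AHAPPlayer {
    private var engine: CHHapticEngine?

    func play(patternAt url: URL) {
        // Check hardware support first: not every device has a Taptic Engine.
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        do {
            let engine = try CHHapticEngine()
            // Production apps should provide these handlers.
            engine.stoppedHandler = { reason in
                print("Engine stopped: \(reason)")
            }
            engine.resetHandler = { [weak engine] in
                // The system asks us to restart the engine after a reset.
                try? engine?.start()
            }
            try engine.start()
            try engine.playPattern(from: url)
            self.engine = engine
        } catch {
            print("Haptic playback failed: \(error)")
        }
    }
}
```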

If you need more control over the playback, instead of using .playPattern(from:) on the engine, you can create a CHHapticPatternPlayer from the engine and have it play a CHHapticPattern.

If you want to pause, seek and continue, set playback speed, or loop a pattern, use CHHapticAdvancedPatternPlayer instead. For example, with dynamic parameter injection at runtime (more on that later), a looping advanced pattern player can be used for continuous real-time feedback in games.
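A sketch of the advanced player in action (the one-second continuous pattern built here is made up for illustration):

```swift
import CoreHaptics

// Assumes a started engine; error handling omitted for brevity.
let engine = try CHHapticEngine()
try engine.start()

// A simple one-second continuous event at 80% intensity.
let event = CHHapticEvent(
    eventType: .hapticContinuous,
    parameters: [CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8)],
    relativeTime: 0,
    duration: 1.0
)
let pattern = try CHHapticPattern(events: [event], parameters: [])

// The advanced player adds looping, seeking, and playback-rate control.
let player = try engine.makeAdvancedPlayer(with: pattern)
player.loopEnabled = true   // repeat until stopped, e.g. for continuous game feedback
player.playbackRate = 1.5   // play 50% faster
try player.start(atTime: CHHapticTimeImmediate)
```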

One unusual aspect of CHHapticPatternPlayer and CHHapticAdvancedPatternPlayer is that they are protocols, not classes. The actual types that implement them are not exposed to the developer. The advanced player protocol inherits from its regular variant; the implication of this protocol-based approach is that the underlying implementation types don't have to mirror that inheritance. This is a rare example of Apple following its own "Protocol-Oriented Programming in Swift" advice in a system framework. Apple claims that haptic players are extremely lightweight and fast to create; this design choice might have played a role in achieving that.

The AHAP file format

Here is an example of a simple AHAP file structure:

{
  "Version": 1.0,
  "Metadata": { "Project": "My Sample" },
  "Pattern": [
    { "Event": {...} },
    { "ParameterCurve": {...} }
  ]
}

Taking a closer look, it is a JSON file with a dictionary at its root that has three keys: a Version key, currently set to 1.0; an optional Metadata dictionary that may contain a creation date and the project name; and a Pattern key that holds the actual haptic events. Like any JSON file, it can be edited or modified with a text editor, as long as it remains valid JSON and follows the AHAP specification.

The most interesting part of the file is the Pattern section: it contains an array of dictionaries, each of which can have one of the keys Event, Parameter, or ParameterCurve. Let's look at them more closely.


Events

Events are, in type and structure, what we already discussed in the previous article: transient and continuous haptic events and their audio counterparts. To play audio in sync with the haptic events, the waveform needs to be referenced from the pattern. Being complex and potentially large binary data, waveforms can't be included in the JSON file; instead, they need to be referenced as external files. The path to the waveform has to be absolute or relative to the bundle where the AHAP file resides, which is usually the app's main bundle. If you want to add waveform audio in code, use .registerAudioResource(…) and related methods on CHHapticEngine. This returns a CHHapticAudioResourceID that can be used to create a haptic audio event for the waveform.
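A sketch of registering a waveform in code ("click.wav" is a hypothetical bundle resource):

```swift
import CoreHaptics

// Assumes a started engine; error handling omitted for brevity.
let engine = try CHHapticEngine()
try engine.start()

// Register the waveform with the engine and get a resource ID back.
let audioURL = Bundle.main.url(forResource: "click", withExtension: "wav")!
let resourceID = try engine.registerAudioResource(audioURL)

// The resource ID becomes an audio event that plays in sync with the haptics.
let audioEvent = CHHapticEvent(audioResourceID: resourceID,
                               parameters: [],
                               relativeTime: 0)
let tap = CHHapticEvent(eventType: .hapticTransient,
                        parameters: [],
                        relativeTime: 0)
let pattern = try CHHapticPattern(events: [audioEvent, tap], parameters: [])
let player = try engine.makePlayer(with: pattern)
try player.start(atTime: CHHapticTimeImmediate)
```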

Events have an EventParameters array that modifies the properties of the complete event in which it is specified. Each parameter in the array has a type and a value.
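As a sketch (the key names follow the AHAP format; the values are made up), a transient event with two event parameters looks like this:

```json
{
  "Event": {
    "Time": 0.0,
    "EventType": "HapticTransient",
    "EventParameters": [
      { "ParameterID": "HapticIntensity", "ParameterValue": 0.8 },
      { "ParameterID": "HapticSharpness", "ParameterValue": 0.4 }
    ]
  }
}
```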


Parameters

In addition to the EventParameters, which were discussed in the previous article, there are "dynamic" parameters that can be modified at runtime during playback of a haptic event. To make things more confusing, these dynamic parameters are called simply Parameter in the AHAP file and sit at the same level as the events. They affect all events at once and have slightly different names than their non-dynamic, event-based counterparts: instead of HapticIntensity, there is HapticIntensityControl; AudioVolume becomes AudioVolumeControl; and the ADSR parameters (the standard Attack, Decay, Sustain, Release envelope stages), like AttackTime, become HapticAttackTimeControl. Audio can also be modified with ADSR envelopes; in this case, use AudioAttackTimeControl and so on. In addition to a type (called ParameterID) and a value, they have a time, which determines their position in the pattern, because they can change during an event.

In addition to being specified in a pattern, dynamic parameters can also be added to an already-playing pattern at runtime:

func sendParameters(_ parameters: [CHHapticDynamicParameter], atTime time: TimeInterval) throws

This method, called on a CHHapticPatternPlayer or CHHapticAdvancedPatternPlayer, causes the parameters to be applied to an already-playing haptic, either immediately (by specifying a time of 0) or at the specified time. The change is applied abruptly, without the value slowly ramping up or down.
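A sketch of dimming a running haptic at runtime (`player` is assumed to be an already-started pattern player):

```swift
import CoreHaptics

// Drop the haptic intensity to 30% of its authored value.
let intensity = CHHapticDynamicParameter(parameterID: .hapticIntensityControl,
                                         value: 0.3,
                                         relativeTime: 0)
// atTime: 0 applies the change immediately; a positive value schedules it.
try player.sendParameters([intensity], atTime: 0)
```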


Parameter curves

If you want dynamic parameters to change linearly over time, use a ParameterCurve. In contrast to a Parameter, the changes are not applied immediately at a given time but are interpolated linearly between control points. In all other aspects, parameter curves are used the same way as non-curve parameters: they affect all events in a pattern; they sit at the same hierarchy level as events and dynamic parameters; and they, too, can be added to an already-running player at runtime:

func scheduleParameterCurve(_ parameterCurve: CHHapticParameterCurve, atTime startTime: TimeInterval) throws

Parameter curves naturally have a duration, whereas parameters without curves are singular events applied at a point in time. So while plain parameters are applied the moment they occur, parameter curves can overlap and mix with each other or with singular parameters. The behavior depends on the type of parameter: for example, intensity curves are combined with subtractive modulation, whereas sharpness curves are additive.
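A sketch of a one-second intensity fade-in scheduled at runtime (`player` is assumed to be an already-created pattern player):

```swift
import CoreHaptics

// Ramp haptic intensity from silent to full over one second.
let fadeIn = CHHapticParameterCurve(
    parameterID: .hapticIntensityControl,
    controlPoints: [
        CHHapticParameterCurve.ControlPoint(relativeTime: 0.0, value: 0.0),
        CHHapticParameterCurve.ControlPoint(relativeTime: 1.0, value: 1.0)
    ],
    relativeTime: 0
)
try player.scheduleParameterCurve(fadeIn, atTime: CHHapticTimeImmediate)
```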

Lofelt Composer makes very heavy use of parameter curves to create haptics that closely resemble the waveforms provided as input. From our measurements, we have learned that in the current implementation of Core Haptics, parameter curves are limited to 16 control points each. But since each curve affects the whole pattern, it is possible to use a chain of multiple curves, which in the end has the same effect as one very long curve.

What’s in the future for Core Haptics?

In its first version, Core Haptics is already very well thought-out and feels mature. While it’s hard to tell with an iPhone-only framework, it seems to be ready for adaptation to other kinds of devices. Third-party authoring tools are coming onto the market, so patterns can easily be created. There is also a considerable number of iPhones with Taptic Engines in users’ hands, making it an attractive platform for creating content.

The current implementation supports use in non-game apps pretty well: sounds within haptic patterns respect the ring/silent switch, which is the intended behavior when the user expects the phone to stay muted. In games, however, the user's needs are different: the developer can choose an AudioSession that overrides the ringer switch, but this preference is currently not carried over to haptic audio. Right now, when the switch is off, the haptics work but stay silent, no matter the current AudioSession. Audio in iOS is a complex topic in and of itself, but we hope Apple considers the needs of game developers in a future release and enables them to match the audio behaviour of the haptic engine with the rest of the app.

Oh, and while we’re at it: AHAP files as push notification “sounds” would also be awesome…

In addition, we wish Core Haptics would be expanded to at least support the Apple Watch in the future. And who knows what else Apple has in mind: for example, the trackpads already have a Taptic Engine. Game controllers are starting to have haptics, and if the next PlayStation is getting a linear actuator in its controller, then why shouldn’t Apple support something similar from iOS? They like to roll out their technologies gradually: first using them internally, then releasing limited pre-baked functionality (here it started with UIFeedbackGenerator), and then giving developers full access and expanding it from one device category to the next. This strategy is in full effect with Core Haptics, and as we see haptic feedback used more prominently outside of Apple, we’re convinced that this is just the beginning of the journey.

In the future, we hope for even more dynamic features: the haptic engine reacting in real time to sound, or integrating more seamlessly with games and game engines. But even now, you can create amazing effects that surprise and delight the user. Developers are just starting to explore the tools, and users are just starting to feel what’s possible.

Originally published on February 19, 2020.
