Let’s rewind to Macworld SF in 2008, the year after Apple introduced the iPhone. Steve Jobs introduces the original MacBook Air, touting a “very generous trackpad” with “multi-touch” gesture support.
It turns out that the multi-touch trackpad was pretty awesome, especially compared to every other trackpad at the time. Swiping, pinching, and rotating were ported over from the iPhone, and they actually felt natural on these trackpads and on the Mac. Despite Apple initially providing little for third-party developers to interface with multi-touch devices, developers almost immediately found ways to grab multi-touch input data and build apps with custom gestures and more. The earliest example of this is MultiClutch, which came out less than two months after that Steve Jobs introduction.
Within the next couple of years, Apple was selling the standalone Magic Trackpad and Magic Mouse, and support for multi-touch gestures within apps got a lot of attention. Apps could even listen to system-wide multi-touch events through Quartz Event Taps, although that would eventually be barred from the macOS App Store. Additionally, event taps let you listen to trackpad input but not Magic Mouse input. This is where the private Multitouch framework comes into play.
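To make the event-tap approach concrete, here’s a minimal sketch of listening for system-wide gesture events with a Quartz Event Tap. The gesture-related raw values (`gesture` = 29, `magnify` = 30, `swipe` = 31, `rotate` = 18) come from AppKit’s `NSEvent.EventType`; note that a listening tap like this requires the Accessibility permission, and this is only a sketch, not a complete app.

```swift
import Cocoa

// Build a CGEventMask covering the gesture-related NSEvent types.
let gestureTypes: [NSEvent.EventType] = [.gesture, .magnify, .swipe, .rotate]
let mask: CGEventMask = gestureTypes.reduce(CGEventMask(0)) {
    $0 | (CGEventMask(1) << $1.rawValue)
}

// Create a session-wide, listen-only tap for those events.
let tap = CGEvent.tapCreate(
    tap: .cgSessionEventTap,
    place: .headInsertEventTap,
    options: .listenOnly,
    eventsOfInterest: mask,
    callback: { _, _, cgEvent, _ in
        // Re-wrap the CGEvent as an NSEvent to read gesture properties.
        if let event = NSEvent(cgEvent: cgEvent) {
            print("gesture event:", event.type.rawValue)
        }
        return Unmanaged.passUnretained(cgEvent)
    },
    userInfo: nil
)

// Attach the tap to the run loop and start listening.
if let tap = tap {
    let source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0)
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, .commonModes)
    CGEvent.tapEnable(tap: tap, enable: true)
    CFRunLoopRun()
}
```

As mentioned above, a tap like this only ever sees trackpad gestures, never Magic Mouse touches, and using it bars the app from the macOS App Store.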
The private framework exposes input after the raw data from the device has been converted into human-readable values, but before that data is piped through a CGEventTap (as CGEvents and NSTouches). The data we get from the private framework actually includes more information than the supported CGEventTap provides, such as the velocity of finger movement and the angle and axis values for the ellipse of your fingertip as it touches the device, which is pretty cool. The framework also gives us data for any of Apple’s multi-touch devices.
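For a sense of what that extra data looks like, here is a Swift mirror of the per-touch record the framework hands to its contact-frame callback, as reconstructed by community reverse-engineering projects. The field names and layout are unofficial, the `unknown` fields are unidentified padding, and none of this is guaranteed to hold across macOS versions.

```swift
// Reverse-engineered (unofficial) layout of a single touch record.
struct MTPoint {
    var x: Float
    var y: Float
}

struct MTVector {
    var position: MTPoint
    var velocity: MTPoint   // finger velocity — not exposed via CGEventTap
}

struct MTTouch {
    var frame: Int32        // frame number this touch belongs to
    var timestamp: Double
    var identifier: Int32
    var state: Int32        // touch phase (e.g. starting, touching, leaving)
    var fingerID: Int32
    var handID: Int32
    var normalizedVector: MTVector  // position/velocity in 0…1 coordinates
    var zTotal: Float       // overall size/pressure of the touch
    var unknown1: Int32
    var angle: Float        // orientation of the fingertip contact ellipse
    var majorAxis: Float    // ellipse axes of the contact area
    var minorAxis: Float
    var absoluteVector: MTVector
    var unknown2: Int32
    var unknown3: Int32
    var zDensity: Float
}
```

The `velocity`, `angle`, `majorAxis`, and `minorAxis` fields are exactly the extras mentioned above that the supported CGEventTap path never surfaces.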
There are definitely some major downsides to using this private framework, though. The biggest is that Apple doesn’t guarantee that we will have access to it. This came to a head when Apple disallowed public usage of the framework entirely in a macOS 10.13.2 beta build. There must have been enough outcry from people using apps built on the private framework (BetterTouchTool, Jitouch, and MagicPrefs, to name a few) that Apple reversed the decision and again allowed the framework to be used. Along those lines, Apple could change the framework in ways that break any app relying on it. With that said, it doesn’t look like it has changed in any breaking way in the many years it has existed. Furthermore, any app that uses private frameworks cannot be sandboxed and therefore cannot be listed on the macOS App Store.
Because this framework is private, there’s no documentation to go with it, and everything we do know comes from some pretty smart people who did the grueling work of reverse engineering it. The first step of grabbing the symbols from the framework isn’t very hard, but beyond that it takes a lot of sifting through assembly and trial and error to figure things out. With a little searching, it’s easy to find a good number of resources on reverse engineering for macOS/iOS. Thankfully, we can reap the benefits of what others have done, and it’s actually pretty straightforward to use the framework and get touch data out of it. Some quick reference examples include ofxTouchPad, FingerMgmt, and Touch/Sesamouse.
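In practice, using the framework means loading it with `dlopen` and resolving its symbols by name (you can list them yourself with something like `nm -g` on the framework binary). Here’s a hedged sketch of that in Swift; the symbol names `MTDeviceCreateDefault` and `MTDeviceStart`, and their assumed C signatures, come from community projects like FingerMgmt, not from Apple, and could change or disappear in any macOS release.

```swift
import Foundation

// Assumed C signatures for two reverse-engineered symbols (unofficial).
typealias MTDeviceCreateDefaultFn = @convention(c) () -> UnsafeMutableRawPointer?
typealias MTDeviceStartFn = @convention(c) (UnsafeMutableRawPointer?, Int32) -> Void

let frameworkPath =
    "/System/Library/PrivateFrameworks/MultitouchSupport.framework/MultitouchSupport"

// Load the private framework at runtime.
guard let handle = dlopen(frameworkPath, RTLD_NOW) else {
    fatalError("Couldn't load MultitouchSupport")
}

// Resolve the symbols we need by name.
guard let createSym = dlsym(handle, "MTDeviceCreateDefault"),
      let startSym = dlsym(handle, "MTDeviceStart") else {
    fatalError("Couldn't resolve MultitouchSupport symbols")
}

let MTDeviceCreateDefault = unsafeBitCast(createSym, to: MTDeviceCreateDefaultFn.self)
let MTDeviceStart = unsafeBitCast(startSym, to: MTDeviceStartFn.self)

// Grab the default multi-touch device and start it. A contact-frame
// callback (MTRegisterContactFrameCallback) would be registered before
// starting in order to actually receive touch data.
let device = MTDeviceCreateDefault()
MTDeviceStart(device, 0)
```

The reference projects listed above all follow roughly this shape, differing mainly in which device-enumeration and callback symbols they resolve.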
You’ll notice that most examples using the private framework are at least a few years old and are not written in Swift. Thankfully, it’s not a crazy amount of effort to get everything working in Swift. In a future post, we’ll dive into more detail on how to do this. In the meantime, check out my app called Multitouch.