Drawing Together: Behind the Scenes
Inko, a peer-to-peer collaborative whiteboard app for iPad & Apple TV
A couple of weeks ago, we released our new collaborative whiteboard app, Inko. Here’s the story of how it began, and some details of how it was made.
Before we start, let’s mention that today we are releasing a variant of the original app, Inko — Group Edition, in response to a number of requests we received regarding VPP compatibility. The original app was distributed as a free app + in-app purchases, and was thus not available through Apple’s Volume Purchase Program. This should make it easier to integrate Inko into schools and larger enterprises. We still hope for better distribution options (i.e., a single app) on the App Store in the future, but this is the best workaround for now.
Back at the end of 2015, we integrated the Apple Pencil very early on into Carbo, our digital notes app for iPad. We literally fell in love with the Pencil: so accurate, and so simple. Although it’s a bit slippery compared to drawing on actual paper, it is otherwise a great digital alternative to a real pen. The end of 2015 was also when the first programmable Apple TV (with tvOS) was released. It was clear that we wanted to do more with the Pencil, building on our other app’s drawing capabilities. We had been toying with the idea of a collaborative whiteboard for quite some time; with the arrival of these devices, and with schools and enterprises starting to move towards digital solutions, it all made sense. Even more so since Apple has started deploying the Apple Pencil and more affordable iPads in schools (see the March special event in Chicago).
We decided to start working on Inko in the summer of 2016. The project was at the time, and is still now, based on 3 key aspects:
- seamless collaboration,
- a great user experience focused on drawing,
- large screen mirroring.
We saw the app as if it were a shared piece of paper: two people in the same room should be able to draw together. Many collaboration apps require devices to be connected to the internet, make users manage some kind of network setup as well as invites, and typically don’t work at all without Wi-Fi.
Early on, we noticed Apple had built some great networking capabilities into iOS that enable devices to “talk” to each other without requiring any setup. These first appeared in the GameKit framework, then in MultipeerConnectivity (MPC). They are Apple-specific extensions to Bonjour that allow N×M direct connectivity over Bluetooth/Wi-Fi radio (but to our understanding, they are not part of the standards).
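To give an idea of how little setup MPC requires, here is a minimal sketch of zero-configuration peer discovery (the service name is made up, delegates are omitted, and as explained next, Inko ultimately uses its own networking stack rather than MPC):

```swift
import UIKit
import MultipeerConnectivity

// Hypothetical service identifier for illustration only.
let serviceType = "inko-draw"
let peerID = MCPeerID(displayName: UIDevice.current.name)

// Each device both advertises itself and browses for nearby peers,
// so no manual network setup or invites are needed.
let session = MCSession(peer: peerID, securityIdentity: nil,
                        encryptionPreference: .required)
let advertiser = MCNearbyServiceAdvertiser(peer: peerID,
                                           discoveryInfo: nil,
                                           serviceType: serviceType)
let browser = MCNearbyServiceBrowser(peer: peerID,
                                     serviceType: serviceType)
advertiser.startAdvertisingPeer()
browser.startBrowsingForPeers()
// Delegates (not shown) handle invitations and session state changes.
```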
We don’t have a long track record of networking apps, so we explored all options, which took months. We hesitated between using Apple’s MPC framework, using open source libraries such as CocoaAsyncSocket, or writing our own networking layer. We opted for our own implementation because of a number of limitations we were not able to overcome with the other approaches.
Network-based apps are a special kind of app. You need to establish the connection between devices, define a protocol for passing messages around, and encode/decode them. But the app must also handle the asynchronicity of messages, as well as the variability in the amount of data it has to process, while remaining responsive at all times. Threads, queues, locks, throttling, compression. Fun stuff.
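As a rough illustration of the “define a protocol, encode/decode” part, here is a sketch of a tagged message with a length prefix so the receiver can reassemble discrete messages from a byte stream (the message type and JSON encoding are ours for the example; Inko’s actual wire format differs):

```swift
import Foundation

// Illustrative message type, not Inko's actual format.
struct StrokeMessage: Codable {
    var senderID: String
    var points: [[Double]]   // flattened x/y pairs
}

// Prepend a 4-byte big-endian length so the receiver can split the
// incoming TCP byte stream back into individual messages.
func frame(_ message: StrokeMessage) throws -> Data {
    let payload = try JSONEncoder().encode(message)
    var length = UInt32(payload.count).bigEndian
    var data = Data(bytes: &length, count: 4)
    data.append(payload)
    return data
}
// Receiving side: read 4 bytes, then that many payload bytes, decode
// on a background queue, and hop to the main queue only to draw.
```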
Collaboration is not just about networking. We needed a robust data model that would sync across users. A few approaches exist, such as CRDTs (conflict-free replicated data types) and OT (operational transformation). We opted for a centralised, repository-like approach that stores editing operations from the different users. This is also the basis of the per-user undo system, and we combined it with a local database of transient snapshots (the canvas is bitmap-based), so that any point in the repo history can be redrawn with a minimal number of operations. The repository also automatically prunes older history to limit complexity in longer sessions. This works pretty well in the current star-shaped configuration, but we could well explore other models (mesh configuration).
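The snapshot-plus-replay idea can be sketched as follows (names and structure are hypothetical simplifications of the approach described above, not Inko’s actual model):

```swift
import Foundation

// Simplified: one operation per edit, tagged with a global sequence.
struct Operation {
    let userID: String
    let sequence: Int
    // ... stroke data, erasures, etc.
}

final class Repository {
    private(set) var operations: [Operation] = []
    private var snapshots: [Int: Data] = [:]  // sequence → bitmap snapshot

    func append(_ op: Operation) { operations.append(op) }

    // Redraw any point in history: start from the nearest earlier
    // snapshot and replay only the operations recorded after it.
    func state(at sequence: Int) -> (snapshot: Data?, toReplay: [Operation]) {
        let base = snapshots.keys.filter { $0 <= sequence }.max()
        let ops = operations.filter {
            (base ?? Int.min) < $0.sequence && $0.sequence <= sequence
        }
        return (base.flatMap { snapshots[$0] }, ops)
    }

    // Pruning: drop history older than the oldest retained snapshot,
    // keeping long sessions bounded in cost.
    func prune(before sequence: Int) {
        operations.removeAll { $0.sequence < sequence }
        snapshots = snapshots.filter { $0.key >= sequence }
    }
}
```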
A large part of the app has been written in Swift, which is very neat, especially for representing data structures such as the ones described here. As for Apple TV, it is very similar overall, particularly in terms of networking, so that part was pretty straightforward. During development, we noticed important changes in network behavior between iOS 10 and iOS 11 (Inko only supports iOS 11) in the way devices establish a connection: under iOS 10, the system would sometimes use Bluetooth instead of Wi-Fi and we had little control over that, whereas iOS 11 appears to unify the networking protocols and pick the speedier option each time. Finally, we also wanted to play with UDP, as it would enable even smaller latencies, but it does not appear to be possible when using the peer-to-peer extensions at this time. So, we’ll try again later 😉.
Drawing Experience & Design
We knew early on that Inko was not a “universal” app running on iPhone and iPad, but rather an iPad app that also happens to run on iPhone. Inko is about content, about the drawing experience, and about live collaboration. We started the usual way, with a standard navigation bar at the top, but it covered useful drawing space on the screen. We finally opted for a fully transparent navigation bar (only buttons) that reveals the content behind it, and that automatically disappears when the pencil approaches while drawing.
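One simple way to implement this kind of auto-hiding chrome (a sketch under our own assumptions, with a hypothetical `chromeController`; Pencil hover is not exposed by UIKit on these devices, so this approximation hides the bar as soon as a Pencil touch lands):

```swift
import UIKit

// Inside the canvas view controller: hide the transparent bar the
// moment a Pencil touch begins, so it never covers the stroke.
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesBegan(touches, with: event)
    if touches.contains(where: { $0.type == .pencil }) {
        chromeController.setBarsHidden(true, animated: true)
    }
}
```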
For the tools, a rather small toolbar made a lot of sense. We wanted it to work for both left- and right-handed users, and to be relocatable, because sometimes you just don’t want it in the way. We came up with the idea of a toolbar that would be “elastic” like rubber, with organic motion: it morphs into a ball when you grab it, and you can then throw it away and it sticks to some other place. This required quite a bit of force-field tuning and other transition animations to get right, but here it is, simple to grab and to reposition to any side. The captures here show the end result of repositioning the toolbar, how it interacts with split screen, and how we tune the force field at its core to achieve that.
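UIKit Dynamics is a natural fit for this kind of organic motion; here is a rough sketch of how a spring field can pull a toolbar back to an edge (view names and parameter values are made up for the example; the real tuning took many iterations):

```swift
import UIKit

// canvasView and toolbarView are hypothetical views in this sketch.
let animator = UIDynamicAnimator(referenceView: canvasView)

// A spring field anchored near the chosen edge pulls the toolbar in
// and lets it settle with a springy, organic motion after a throw.
let field = UIFieldBehavior.springField()
field.position = CGPoint(x: 40, y: canvasView.bounds.midY) // left edge
field.region = UIRegion(size: canvasView.bounds.size)
field.addItem(toolbarView)
animator.addBehavior(field)

// Some resistance keeps the toolbar from oscillating forever.
let damping = UIDynamicItemBehavior(items: [toolbarView])
damping.resistance = 4
animator.addBehavior(damping)
```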
As for the actual drawing experience (something we already discussed with Carbo), Inko uses every bit of information the Apple Pencil provides. Technically, it makes use of precise pen location, 240 Hz sampling, predictive motion data that is corrected when actual values become available a few tens of milliseconds later, and of course angles and pressure to construct the drawing shape. Inko should work great with the Logitech Crayon as well, although some features like pressure are missing compared to the Apple Pencil (feedback appreciated if you have one!). We iterated on the drawing method to further improve it, but also to make it work over the network, as drawing shapes are propagated to other devices in real time as they occur, using custom splitting and compression schemes to achieve that instantaneous look & feel. Inko offers two (mostly) fixed pen sizes plus one calligraphic variant that reacts more to pressure/speed/angles. We are already getting feedback asking for more drawing capabilities like a “smart pen”; these are great suggestions, and we do listen!
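For readers curious about the Pencil pipeline, this is roughly how an app consumes the full 240 Hz stream plus prediction with UIKit (a minimal sketch; `renderStroke` and `updatePrediction` stand in for an app’s own renderer):

```swift
import UIKit

// Inside a drawing view: consume every Pencil sample, then the
// predicted ones that hide latency until real data arrives.
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first, let event = event else { return }

    // Coalesced touches carry every 240 Hz sample since the last
    // frame, not just the most recent one.
    for sample in event.coalescedTouches(for: touch) ?? [touch] {
        renderStroke(at: sample.preciseLocation(in: self),
                     force: sample.force,
                     altitude: sample.altitudeAngle)
    }
    // Predicted touches are drawn provisionally and replaced once the
    // actual samples come in (touchesEstimatedPropertiesUpdated).
    for predicted in event.predictedTouches(for: touch) ?? [] {
        updatePrediction(at: predicted.preciseLocation(in: self))
    }
}
```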
Another important evolution from Carbo is the addition of colors to the canvas. Carbo has a monochrome black & white canvas that renders pixel-free contents, even though it’s backed by bitmap data. That monochrome aspect was its signature at the time, and also made it easier to implement on previous-generation GPUs (remember the iPad 3 with the first Retina display?). Colors were needed in the context of a collaborative drawing app, though, for obvious reasons. But adding color impacts how data is represented on the GPU, how drawing is achieved, and how contents are stored to disk (premultiplied alpha wouldn’t work, so we couldn’t use Apple’s Image I/O). In the last couple of months before release, we discovered we could actually use the P3 color gamut of the iPad Pros, but, because of Inko’s GPU rendering, there were interactions with True Tone and Night Shift that we finally resolved with custom shader computations. The reds are particularly bright on a P3 display, which looks unusual & fun, but we added an option for those who’d rather stay with sRGB.
Large Screen with Apple TV
Apps that run on a television are a great addition that is probably underused at this time (AppStories discussed it recently). In the context of a collaborative drawing app especially, TV apps open the door to classrooms, auditoriums, and company meeting rooms, as dedicated hardware can be replaced by cheaper, off-the-shelf tablets and TVs, even beautiful 4K ones.
We wanted a native Apple TV app to offer a great experience. Had we limited ourselves to iPad display mirroring, we would have lost the ability to display full-screen 16:9 contents (the iPad is 4:3, which would have shown margins on the TV), as well as faster live feedback and content interaction with the remote. The app, just like on iOS, had to be quick: launch it and join a drawing group with a single press, no complex setup.
Thankfully, as we discovered, tvOS is an operating system that shares a large common foundation with iOS. The GPU rendering and networking parts of Inko were straightforward to port to tvOS, and only the user interface had to be recreated (and not even from scratch, since some parts could be made cross-platform with iOS). We kept the interface simple and focused: a screen to join nearby groups, the current drawing, the list of drawings, and the list of the group’s participants.
And here’s a quick story about the Apple TV remote. Inko lets the user zoom in and out (double-press) and scroll the zoomed contents by touch using the remote. However, we also wanted a way to interact collaboratively on the active drawing directly from the Apple TV itself. Drawing from the remote was obviously not accurate enough, as the touch surface is a bit small, so we only added the highlighting tool. While we were at it, we noticed the white-circled remote has a gyroscopic sensor (which tracks rotational motion), and we gave it a test run to implement a laser pointer à la Wii remote: clicking and wrist rotation can be used to highlight some contents. Unlike the Wii, though, which has an absolute reference, we must set the zero position ourselves. We do that each time the user enters highlighting mode by pressing the play button (think of it as a mini-calibration), so it’s important to point the remote towards the TV at that particular moment.
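The zeroing idea is easy to illustrate with Core Motion on iOS (on tvOS the remote’s motion data arrives through the GameController framework instead, but the calibration logic is the same; this sketch is our own simplification, not Inko’s actual code):

```swift
import CoreMotion

let motion = CMMotionManager()
var referenceAttitude: CMAttitude?

// When the user enters highlighting mode, the first attitude we see
// becomes the zero reference (the "mini-calibration" moment).
motion.deviceMotionUpdateInterval = 1.0 / 60.0
motion.startDeviceMotionUpdates(to: .main) { deviceMotion, _ in
    guard let attitude = deviceMotion?.attitude else { return }
    if referenceAttitude == nil {
        referenceAttitude = attitude.copy() as? CMAttitude
    }
    guard let reference = referenceAttitude else { return }
    // Express the current attitude relative to the calibrated zero.
    attitude.multiply(byInverseOf: reference)
    // attitude.yaw / attitude.pitch now give the wrist rotation since
    // calibration; map them to on-screen pointer coordinates.
}
```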
With Inko, we’re tackling new fields (education, companies) that we’re not necessarily familiar with. We’ve started discussions about school systems and modern teaching techniques here in Belgium, both with officials and with teachers/professors, in order to understand how things are organized, what needs exist, and how this compares to other regions of the world (there are important differences, obviously).
At this time, Inko has already received a warm welcome, with an install base nearing 200,000 after just a month. It has been our fastest-growing app so far (proud!). We’re in the process of gathering user feedback (there’s quite a lot of it) and tuning some behaviors of the app, like making the evaluation duration longer and improving palm rejection (in 1.0.2), and also adding a VPP version as mentioned previously.
As for the future, we already mentioned that we’d like to offer remote collaboration, as it could be very useful in certain use cases. We’ve also received a number of requests to make a better OneNote (Microsoft’s app) for collaborating together. To our understanding, that app is about documents, annotations, and data of different types, which is not exactly what Inko offers, namely “drawing together”, and that is what we’d like to keep focusing on at this time.