Shared ARKit for Everyone, Everywhere
As you may have heard, Apple released ARKit last month: its new augmented reality (AR) framework for building AR experiences on iOS. This is Apple’s declaration that AR should be more widespread and in the pockets of more people. There has been a lot of hype around AR over the past few months (the success of Pokemon GO was one indication). The technology is great in that it enhances the world around you. But augmented reality so far has been a primarily solo, isolating affair: users have not been able to share AR experiences in real time with those around them. We fully believe that interacting with our world requires interacting with the people around us. After all, why enhance a world with no one in it?
With this in mind, we set out this past weekend to create an experience that no one had seen before: real-time, shared, multi-user augmented reality with ARKit, anytime, anywhere. In other words, we wanted to make it possible for people to see, change, and interact with the same things in their augmented world, at the same time. Ultimately, this entailed creating a solution for real-time state management for shared AR experiences.
We chose to recreate the classic arcade shooter Duck Hunt. State management in games is a nontrivial problem — it requires managing and updating lots of information (scores, locations, and so on) very quickly — and we wanted to create shared AR experiences in a way that was both robust and flexible. Robust in that your AR experience isn’t tethered to just your house; flexible in that we built an easily extensible and usable framework for anyone to build shared AR experiences. The foundation of our framework, and what allows for a seamless AR experience, is the peer-to-peer communication solution we implemented. Our app works over both Bluetooth and Wi-Fi, ensuring both practicality in usage and robustness in real-time syncing.
Before re-creating Duck Hunt, we initially wanted to create a multiplayer AR race-car game where users could go anywhere in the world, place a race track on any surface, and race on a track that was synced up in space. Along the way, we ran into some obstacles, and we think the lessons we learned will be invaluable to the developer community.
In building a shared ARKit experience, everyone must have a synced-up augmented reality world. Three components go into creating such a world: position, orientation, and state.
The difficulty in syncing up positions lies in triangulating each iOS device. Though using CoreLocation or iBeacons may seem enticing, their accuracy is simply not good enough for ARKit experiences. A better way to tackle this is to have a shared location that users scan in the ARKit world. Whether by scanning a QR code or scanning a shared plane, you can roughly triangulate the positions of different users and use those locations to position objects in the ARKit world accordingly. The last (and most reliable) way is to have all devices start from the same initial position in space, so that they share a single 3-D coordinate system.
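The shared-reference-point idea can be sketched as a coordinate transform: once both users have scanned the same marker, positions are exchanged relative to that marker rather than to each device's private world origin. This is a minimal sketch; `sharedAnchorTransform` is a hypothetical value you would obtain from a QR-code or plane hit test, not an ARKit API.

```swift
import ARKit

// Sketch: express a node's position relative to a shared anchor, so it
// can be sent to peers whose ARKit world origins differ from ours.
// `sharedAnchorTransform` is assumed to come from scanning a common
// reference point (e.g. a QR code or a mutually scanned plane).
extension SCNNode {
    func positionRelative(to sharedAnchorTransform: simd_float4x4) -> simd_float3 {
        let worldPosition = simd_float4(simdPosition.x, simdPosition.y, simdPosition.z, 1)
        // Inverting the shared anchor's transform maps world space
        // into the anchor's local (shared) space.
        let relative = simd_mul(sharedAnchorTransform.inverse, worldPosition)
        return simd_float3(relative.x, relative.y, relative.z)
    }
}
```

The receiving device applies its own copy of the anchor's transform to map the shared coordinates back into its local world space.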
Having the positions of each user alone is not enough to build a shared ARKit experience; to sync up each world, you also need each user’s orientation.
An initial thought was to use the compass on each iOS device, fixing each user’s orientation using gravity and heading. And it was a good idea… until we realized the compasses simply weren’t reliable. We consistently found discrepancies between our different iOS devices’ compasses that threw our ARKit experiences out of sync. And, unfortunately, there doesn’t seem to be a way to recalibrate the compass in iOS 11. On top of that, plane detection simply would not work while heading-based alignment was configured.
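The compass-based approach described above corresponds to ARKit’s heading-based world alignment, which fixes the coordinate axes against true north. A sketch of both configurations, assuming the standard `ARWorldTrackingConfiguration` API:

```swift
import ARKit

// The compass-based approach: ARKit aligns the world axes to gravity
// and true north, relying on the device compass.
let headingConfig = ARWorldTrackingConfiguration()
headingConfig.worldAlignment = .gravityAndHeading

// The fallback: gravity-only alignment (no compass dependence),
// combined with a shared scanned reference point for orientation,
// with horizontal plane detection enabled.
let fallbackConfig = ARWorldTrackingConfiguration()
fallbackConfig.worldAlignment = .gravity
fallbackConfig.planeDetection = .horizontal

// sceneView.session.run(fallbackConfig)  // sceneView: an ARSCNView
```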
Shared ARKit experiences require a shared state that updates in real-time. For example, an ARKit game with lag between what two users see is both disorienting and unacceptable. In trying to sync state between multiple users, we ran into some pretty interesting road bumps.
Typically, real-time multiplayer online games use a central server to control state between users. Though a client-server implementation works for that use case, the roundtrip cost of traveling to a server was simply not good enough for our shared real-time AR experience. Sockets may work for some use cases, but we wanted no roundtrip cost at all. We had to figure out a way to get state-change time under 20 ms.
So, we ended up implementing a solution that used peer-to-peer networking to sync state changes between users. It works over Bluetooth, Wi-Fi, and LTE, so you can experience ARKit anywhere, regardless of your internet connection. We can even support up to 8 users in a shared ARKit experience without any loss of quality in the augmented reality interaction.
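A peer-to-peer layer like this can be sketched with Apple’s MultipeerConnectivity framework, which transparently uses Wi-Fi and Bluetooth between nearby devices. This is a minimal sketch, not our exact implementation: the `"duckhunt-ar"` service type and the `GameState` shape are hypothetical, and the browser/invitation handshake is omitted for brevity.

```swift
import Foundation
import MultipeerConnectivity

// Hypothetical shared state; in a real game this would carry scores,
// object positions, hit events, and so on.
struct GameState: Codable {
    var scores: [String: Int]
}

final class PeerSync: NSObject, MCSessionDelegate {
    private let peerID = MCPeerID(displayName: ProcessInfo.processInfo.hostName)
    private lazy var session: MCSession = {
        let s = MCSession(peer: peerID, securityIdentity: nil,
                          encryptionPreference: .required)
        s.delegate = self
        return s
    }()
    private lazy var advertiser = MCNearbyServiceAdvertiser(
        peer: peerID, discoveryInfo: nil, serviceType: "duckhunt-ar")

    var onStateReceived: ((GameState) -> Void)?

    func start() { advertiser.startAdvertisingPeer() }

    /// Broadcast a state change to every connected peer. `.unreliable`
    /// trades delivery guarantees for latency, which matters when the
    /// budget for a state change is under 20 ms.
    func broadcast(_ state: GameState) {
        guard let data = try? JSONEncoder().encode(state),
              !session.connectedPeers.isEmpty else { return }
        try? session.send(data, toPeers: session.connectedPeers, with: .unreliable)
    }

    // MARK: MCSessionDelegate
    func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {
        if let state = try? JSONDecoder().decode(GameState.self, from: data) {
            onStateReceived?(state)
        }
    }
    func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
    func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
    func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, with error: Error?) {}
}
```

Note that MultipeerConnectivity alone covers the Bluetooth/Wi-Fi case; syncing over LTE would need a separate transport.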
Additionally, we ran into a whole slew of bugs while using ARKit and SceneKit. For example, if you try to use the SCNPhysicsVehicle API on a vehicle model you have scaled down, SceneKit ends up discarding the scale applied to the wheels. This bug in SceneKit is a real problem when you need to scale a 10-foot car model down to one foot. We hope Apple resolves these bugs as more developers start creating ARKit applications.
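One possible workaround is to stop relying on node scale for the wheels and instead bake the scale into each `SCNPhysicsVehicleWheel` via its `radius` and `connectionPosition` properties. A hedged sketch, assuming hypothetical node names (`"wheel_FL"`, etc.) in the car model:

```swift
import SceneKit

// Workaround sketch: instead of scaling the vehicle node (where
// SceneKit drops the scale on the wheels), re-apply the scale to each
// wheel's physics parameters directly. Node names are hypothetical.
func makeScaledVehicle(from carNode: SCNNode, scale: Float) -> SCNPhysicsVehicle? {
    guard let chassisBody = carNode.physicsBody else { return nil }
    let wheelNames = ["wheel_FL", "wheel_FR", "wheel_RL", "wheel_RR"]
    let wheels: [SCNPhysicsVehicleWheel] = wheelNames.compactMap { name in
        guard let node = carNode.childNode(withName: name, recursively: true)
        else { return nil }
        let wheel = SCNPhysicsVehicleWheel(node: node)
        // Re-apply the scale SceneKit discards from the wheel geometry.
        wheel.radius *= CGFloat(scale)
        wheel.connectionPosition = SCNVector3(
            node.position.x * scale, node.position.y * scale, node.position.z * scale)
        return wheel
    }
    return SCNPhysicsVehicle(chassisBody: chassisBody, wheels: wheels)
}
```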
Back to Duck Hunt. We wanted to bring the classic game into 2017 with an augmented reality twist. We had a ton of fun taking on these challenges and building the first public demo of a real-time, shared ARKit experience that works anywhere, at any time. We hope you enjoyed reading about what we learned. Ultimately, AR is something that will be in the hands of increasingly more people, especially with the release of iOS 11. Hopefully, people can change the world in very interesting ways, together.
Sign up here for our updates on our apps and more shared AR! September 2017 and iOS 11 are coming quickly. Get excited.
By Rohun Saxena, Aneesh Pappu, Brexton Pham, and Josh Singer