Augmented Reality at Landmarks: ARKit+CoreLocation in Swift
Position augmented reality objects on real-world locations so users can move and see virtual pins floating on landmarks
This article will explain how to:
- Place AR pins over landmarks
- Keep the AR pins on their landmarks as the user moves
- Avoid the pitfalls of ARKit and its quirks
To show you how it works, we made a video while beta testing the app we designed and developed for a popular ski resort in North Vancouver.
Just Give Me the Code
If you just want to get to the code, take a look at this framework I made, which shows how all the pieces for making ARKit+CoreLocation work together. Alternatively, you could take a look at Project Dent, which is a popular framework for doing this kind of thing.
If you are like me, and find that Dent doesn’t quite do the trick for you (or you just want to know a little more about how to do this ARKit stuff) then, Hello! I hope you enjoy this short read.
Let’s take a look at the different components that need to work together so that we can do this. As a reminder, our goal is to position virtual objects on top of real-world locations. We also want the users to be able to walk around. If walking around isn’t important, there are easier ways to do this (you can check out Yat Choi’s article on Waypoints, for example).
Part 1: ARKit
ARKit provides all the functionality for placing our virtual objects. Device orientation, rotation, etc. are all managed by ARKit. All we need to do is configure it, like this:
import ARKit

let arSKView = ARSKView(frame: view.bounds)
view.addSubview(arSKView)
arSKView.presentScene(SKScene(size: view.bounds.size))
let configuration = ARWorldTrackingConfiguration()
configuration.worldAlignment = .gravityAndHeading // align axes with gravity and true north
arSKView.session.run(configuration, options: [.resetTracking])
If you want to know more about configurations, you can check out the documentation.
When we start ARSKView’s session, it creates a virtual world centered on the device’s current location. All virtual objects we place will be positioned relative to that location. So, we can’t just hand our session a GPS coordinate for each landmark. We need to find the transformation that describes the location of the landmark relative to the AR world origin. You can take a look at the makeARAnchor function in my gist to see how to do that.
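To give a feel for what a function like makeARAnchor has to do, here’s a rough sketch (the gist has the full version; the flat-earth approximation and the exact signature below are my own simplification, not the gist verbatim). With .gravityAndHeading, the AR world’s +x axis points east and its -z axis points true north, so a landmark’s session-relative position can be derived from its lat/lon offset from wherever the session started:

```swift
import ARKit
import CoreLocation

// Sketch: build an ARAnchor positioned relative to the AR world origin,
// given the location where the session started and the landmark's location.
func makeARAnchor(origin: CLLocation, landmark: CLLocation) -> ARAnchor {
    let metersPerDegreeLat = 111_111.0
    let deltaLat = landmark.coordinate.latitude - origin.coordinate.latitude
    let deltaLon = landmark.coordinate.longitude - origin.coordinate.longitude
    let north = deltaLat * metersPerDegreeLat
    let east = deltaLon * metersPerDegreeLat * cos(origin.coordinate.latitude * .pi / 180)

    var transform = matrix_identity_float4x4
    transform.columns.3.x = Float(east)    // +x is east
    transform.columns.3.z = Float(-north)  // -z is true north
    return ARAnchor(transform: transform)
}
```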
However, there are two problems.
- For some reason, ARKit won’t render anchors that are far away (more than ~100 meters) from the world origin. So, even if we properly transform our landmarks from real-world coordinates to session-relative locations, they won’t be visible if they are too far away. Well, that’s not too bad. We can check if the location is too far, and just project the pin closer. That leads us to the second problem.
- ARSession doesn’t update the world origin as you move. So, if you project the location to be closer to the user (like, if you put the summit of Mount Everest 75 meters away), then as the user walks around they’ll find that the location isn’t where it should be (Mount Everest might be just down the street. Who knew?).
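For the first problem, one way to “project the pin closer” is to keep the landmark’s direction (bearing and elevation) and just pull its anchor in to a fixed radius. A minimal sketch, where the 75-meter radius and the function name are my own choices:

```swift
import simd

// If the landmark's session-relative translation is beyond maxDistance,
// scale it back onto a sphere of radius maxDistance around the origin.
// The direction is preserved; only the distance changes.
func projectedCloser(_ translation: SIMD3<Float>, maxDistance: Float = 75) -> SIMD3<Float> {
    let distance = simd_length(translation)
    guard distance > maxDistance else { return translation }
    return translation * (maxDistance / distance)
}
```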
These problems can be solved by using CoreLocation a bit more.
Part 2: CoreLocation
If a landmark is more than 100 meters away and we put the virtual pin closer than that, then we need to be able to move the pin further away if the user approaches it. We can do this by recentering the AR world’s origin. So we’ll register for location updates, then check if the user is more than some distance (say, 10 meters) from the AR origin. If they are, we’ll do 3 things:
- Remove all the current landmark pins
- Restart the ARSession so that the world recenters on the user
- Place all the landmarks again
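The three steps above can be sketched with a CLLocationManagerDelegate. This is illustrative only — the class name, threshold, and helper stubs are my own, standing in for the app’s real implementations:

```swift
import ARKit
import CoreLocation

class LandmarkUpdater: NSObject, CLLocationManagerDelegate {
    let recenterThreshold: CLLocationDistance = 10  // meters from the AR origin
    var sessionOrigin: CLLocation?

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let current = locations.last else { return }
        guard let origin = sessionOrigin else {
            sessionOrigin = current  // first fix: remember where the session started
            return
        }
        guard current.distance(from: origin) > recenterThreshold else { return }

        removeAllPins()                  // 1. remove the current landmark pins
        restartSession()                 // 2. restart the ARSession so the world recenters
        sessionOrigin = current
        placeLandmarks(around: current)  // 3. place all the landmarks again
    }

    // Stubs standing in for the app's real implementations:
    func removeAllPins() { /* remove each pin's anchor and SKNode */ }
    func restartSession() { /* session.run(configuration, options: [.resetTracking, .removeExistingAnchors]) */ }
    func placeLandmarks(around location: CLLocation) { /* makeARAnchor for each landmark */ }
}
```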
And that’s it! All your AR content will update as the user moves.
Where to go from here
Once all the basics are set up, there are a lot of cool things you can do. Here are some examples of things that I’ve been able to get working:
- Do your own pin (aka SKNode) scaling, instead of just using ARKit’s
- Set the zPosition of SKNodes relative to their distance from the user, so that pins that are further away appear behind pins that are closer.
- Subclass SKScene and override the touchesBegan method to be notified when the user taps a pin.
- Periodically check for overlapping pins, and only show the closer ones, etc.
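For the tap-handling idea, an SKScene subclass might look like this (a sketch; the class name and the convention of naming pin nodes are my own):

```swift
import SpriteKit

class LandmarkScene: SKScene {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let location = touches.first?.location(in: self) else { return }
        // nodes(at:) returns every node under the touch point, front to back
        for node in nodes(at: location) where node.name != nil {
            print("Tapped pin: \(node.name!)")
        }
    }
}
```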
Skyler Smith and FreshWorks Studio
Skyler Smith is the lead iOS developer at FreshWorks Studio, working with other experts to design and develop elegant and highly functional mobile, web, and blockchain apps. Our 50+ person team is based in Victoria and Vancouver and we work with bold entrepreneurs, startups, enterprises, and government organizations.
Contact us anytime to find out how we can help your organization succeed.