Augmented Reality at Landmarks: ARKit+CoreLocation in Swift

Skyler Smith
Sep 25, 2018 · 4 min read

Position augmented reality objects on real-world locations so users can move and see virtual pins floating on landmarks

  • Place AR pins over landmarks
  • Keep the AR pins on their landmarks as the user moves
  • Avoid the pitfalls of ARKit and its quirks

At FreshWorks Studio we recently developed an app for Grouse Mountain that used this functionality (available on the App Store).

To show you how it works, we made a video while doing beta testing of the app we designed and developed for the popular ski resort in North Vancouver.

Beta Testing an AR App for Grouse Mountain Ski Resort

Just Give Me the Code

If you just want to get to the code, take a look at this framework I made, which shows how all the pieces for making ARKit+CoreLocation work together. Alternatively, you could take a look at Project Dent, which is a popular framework for doing this kind of thing.

If you are like me, and find that Dent doesn’t quite do the trick for you (or you just want to know a little more about how to do this ARKit stuff) then, Hello! I hope you enjoy this short read.

The Parts

Let’s take a look at the different components that need to work together so that we can do this. As a reminder, our goal is to position virtual objects on top of real-world locations. We also want the users to be able to walk around. If walking around isn’t important, there are easier ways to do this (you can check out Yat Choi’s article on Waypoints, for example).

ARKit provides all the functionality for placing our virtual objects. Device orientation, rotation, etc. are all managed by ARKit. All we need to do is configure it, like this:

let scene = SKScene(size: view.bounds.size)
// Give the view a frame, or it will be invisible
let arSKView = ARSKView(frame: view.bounds)
arSKView.presentScene(scene)
view.addSubview(arSKView)

let configuration = ARWorldTrackingConfiguration()
// .gravityAndHeading aligns the AR axes with the real world:
// +x points east, +y points up, and -z points true north
configuration.worldAlignment = .gravityAndHeading
arSKView.session.run(configuration, options: [.resetTracking])

If you want to know more about configurations, you can check out the documentation.

When we start the ARSKView’s session, it creates a virtual world centered on the device’s current location. All virtual objects we place are positioned relative to that origin. So we can’t just hand our session a GPS coordinate for each landmark; we need to find the transformation that describes the landmark’s location relative to the AR world origin. You can take a look at the makeARAnchor function in my gist to see how to do that.
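The math behind a makeARAnchor-style helper can be sketched in plain Swift. This is a rough, framework-free version under two assumptions: the session uses .gravityAndHeading (so +x is east and -z is true north), and the simple `Coordinate` struct here stands in for CLLocationCoordinate2D.

```swift
import Foundation

// Stand-in for CLLocationCoordinate2D so the math is self-contained.
struct Coordinate { var latitude: Double; var longitude: Double }

let earthRadius = 6_371_000.0 // meters

// Haversine great-circle distance between two coordinates, in meters.
func distance(from a: Coordinate, to b: Coordinate) -> Double {
    let dLat = (b.latitude - a.latitude) * .pi / 180
    let dLon = (b.longitude - a.longitude) * .pi / 180
    let lat1 = a.latitude * .pi / 180
    let lat2 = b.latitude * .pi / 180
    let h = sin(dLat / 2) * sin(dLat / 2)
          + cos(lat1) * cos(lat2) * sin(dLon / 2) * sin(dLon / 2)
    return 2 * earthRadius * asin(h.squareRoot())
}

// Initial bearing from a to b, in radians clockwise from true north.
func bearing(from a: Coordinate, to b: Coordinate) -> Double {
    let lat1 = a.latitude * .pi / 180
    let lat2 = b.latitude * .pi / 180
    let dLon = (b.longitude - a.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    return atan2(y, x)
}

// Translation of the landmark relative to the AR world origin.
// With .gravityAndHeading: +x = east, +y = up, -z = true north.
func arTranslation(from origin: Coordinate, to landmark: Coordinate) -> (x: Double, y: Double, z: Double) {
    let d = distance(from: origin, to: landmark)
    let theta = bearing(from: origin, to: landmark)
    return (x: d * sin(theta), y: 0, z: -d * cos(theta))
}
```

In a real helper you would pack this translation into a matrix and hand it to an ARAnchor; the gist shows that last step.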

However, there are two problems.

  1. For some reason, ARKit won’t render anchors that are far away (more than ~100 meters) from the world origin. So, even if we properly transform our landmarks from real-world coordinates to session-relative locations, they won’t be visible if they are too far away. Well, that’s not too bad. We can check if the location is too far, and just project the pin closer. That leads us to the second problem.
  2. ARSession doesn’t update the world origin as you move. So, if you project the location to be closer to the user (like, if you put the summit of Mount Everest 75 meters away), then if the user walks around they’ll find that the location isn’t where it should be (Mount Everest might be just down the street. Who knew?).
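The "project the pin closer" workaround from problem 1 can be sketched as a one-off helper. This is just one way to do it (the ~100 meter cutoff and the 75 meter target are assumptions taken from the numbers above): scale the landmark's AR-space translation so the pin lands on the same bearing, but within rendering range.

```swift
import Foundation

// If a landmark's AR-space offset is farther than ARKit reliably renders,
// shrink the translation vector so the pin keeps its bearing but sits
// within range (75 m here, comfortably inside the ~100 m cutoff).
func projectWithinRange(x: Double, z: Double, maxDistance: Double = 75) -> (x: Double, z: Double) {
    let d = (x * x + z * z).squareRoot()
    guard d > maxDistance else { return (x, z) } // already close enough
    let scale = maxDistance / d
    return (x * scale, z * scale)
}
```

Note that this is exactly what triggers problem 2: a projected pin is only correct from where the user was standing when it was placed.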

These problems can be solved by using CoreLocation a bit more.

If a landmark is more than 100 meters away and we place its virtual pin closer than that, we need a way to push the pin back out as the user approaches it. We can do this by recentering the AR world’s origin. So we’ll register for location updates, then check whether the user has moved more than some distance (say, 10 meters) from the AR origin. If they have, we’ll do three things:

  1. Remove all the current landmark pins
  2. Restart the ARSession so that the world recenters on the user
  3. Place all the landmarks again
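The three steps above can be sketched as a small, framework-free controller. In a real app the closures would remove the SKNodes, re-run the ARSession (with .resetTracking, as in the configuration code earlier), and call makeARAnchor for each landmark again; here they are hypothetical hooks so the control flow is the focus.

```swift
import Foundation

// Drives the recentering cycle. Feed it the user's distance from the AR
// world origin on each CLLocationManager update.
final class Recenterer {
    let threshold: Double                 // meters of drift before recentering
    let removePins: () -> Void            // step 1: clear current landmark pins
    let restartSession: () -> Void        // step 2: recenter the AR world on the user
    let placeLandmarks: () -> Void        // step 3: place all the landmarks again

    init(threshold: Double = 10,
         removePins: @escaping () -> Void,
         restartSession: @escaping () -> Void,
         placeLandmarks: @escaping () -> Void) {
        self.threshold = threshold
        self.removePins = removePins
        self.restartSession = restartSession
        self.placeLandmarks = placeLandmarks
    }

    /// Returns true if a recenter was performed.
    @discardableResult
    func locationDidUpdate(distanceFromOrigin: Double) -> Bool {
        guard distanceFromOrigin > threshold else { return false }
        removePins()
        restartSession()
        placeLandmarks()
        return true
    }
}
```

Keeping the three steps behind one method makes it easy to call from locationManager(_:didUpdateLocations:) without scattering session-restart logic around the view controller.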

And that’s it! All your AR content will update as the user moves.


Where to go from here

Once all the basics are set up, there are a lot of cool things you can do. Here are some examples of things that I’ve been able to get working:

  1. Do your own pin (aka SKNode) scaling, instead of just using ARKit’s
  2. Set the zPosition of SKNodes relative to their distance from the user, so that pins that are further away appear behind pins that are closer.
  3. Subclass SKScene and use the touchesBegan method to be notified when the user taps a pin.
  4. Periodically check for overlapping pins, and only show the closer ones, etc.
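As one concrete example, idea 2 boils down to mapping each pin's distance to a zPosition. The mapping below is an assumption on my part (any monotonically decreasing function of distance works) and the 500 meter cap is arbitrary:

```swift
import Foundation

// Map a pin's distance from the user to a zPosition for its SKNode:
// 0 m -> 1.0 (drawn in front), maxDistance or beyond -> 0.0 (drawn behind).
func zPosition(forDistance distance: Double, maxDistance: Double = 500) -> Double {
    let clamped = min(max(distance, 0), maxDistance)
    return 1.0 - clamped / maxDistance
}
```

You would recompute this on each location update, alongside any custom scaling, so the draw order stays correct as the user moves.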

Skyler Smith and FreshWorks Studio

Skyler Smith is the lead iOS developer at FreshWorks Studio, working with other experts to design and develop elegant and highly functional mobile, web, and blockchain apps. Our 50+ person team is based in Victoria and Vancouver and we work with bold entrepreneurs, startups, enterprises, and government organizations.

Contact us anytime to find out how we can help your organization succeed.

FreshWorks Studio

Your Technology Partner

Skyler Smith

Written by

iOS Developer since 2015

FreshWorks Studio

FreshWorks Studio is ranked as the #1 Canadian app development company. We design and develop mobile and web applications driven by user experience.
