The Reality Side of AR

e-Legion
Published Aug 23, 2018 · 3 min read

The excitement so far with ARKit has centered on making AR objects react to the real world. How about making AR objects part of that world?

In his talk, Alex Curylo from Agoda.com explored using Core Location to create an AR experience anchored to real-world locations. His point: the current hype around ARKit should be a trigger to build useful applications that are deeply integrated into real life.

Read the overview, watch the video and enjoy the useful links Alex has shared.

Everybody wants to change the world

There are three types of apps using AR:

· Good apps that do what ARKit, as it stands, is good for: given an empty space with detected planes and a well-defined object you can move around, you get a genuinely useful app. Interior design apps are a good example.

· Bad apps, like rulers, object apps such as a shark that swims around, or drawing apps where you wave your hand around to make a 3D object. These are fun, but not for real life.

· The ugly apps that require too much effort and make no real difference.

But it’s going to be a party trick until the reality side of AR (which I stole the title of this from) is more than just a backdrop.

Steven Johnson, pop-science author.

Putting this even more strongly: since smartphone apps are our “default” UX, your AR app should only exist if it does something that can only be done in AR.

Matt Miesnieks, CEO of SuperVentures.

The reality side of AR

Three frameworks that developers need to understand to get started with AR:

  • Core ML, to help understand scene content;
  • Core Location, to determine the indoor and outdoor location of objects;
  • AR Cloud, to share and persist AR content across devices.

Object-aware models

In the first demo, tapping an object brings up a description of it obtained from a neural network.

The steps to build the demo:

  • Create an SKScene;
  • Load the Inception .mlmodel;
  • Pass in the video feed;
  • On tap, add an ARAnchor and attach the current top model result as an SKLabelNode.
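The steps above can be sketched roughly as follows. This is a minimal sketch, not the talk’s actual code: it assumes an ARSKView wired up in a view controller and an Inception v3 Core ML model (class name `Inceptionv3` here is an assumption) added to the project.

```swift
import ARKit
import SpriteKit
import Vision

class TagViewController: UIViewController, ARSKViewDelegate {
    @IBOutlet var sceneView: ARSKView!
    var latestPrediction = "…"

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        // Present an empty SKScene to host the labels.
        sceneView.presentScene(SKScene(size: view.bounds.size))
        sceneView.session.run(ARWorldTrackingConfiguration())
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let frame = sceneView.session.currentFrame else { return }

        // Classify the current video frame; remember the top result.
        // `Inceptionv3` is the class Xcode generates for the bundled .mlmodel.
        guard let model = try? VNCoreMLModel(for: Inceptionv3().model) else { return }
        let request = VNCoreMLRequest(model: model) { request, _ in
            if let top = (request.results as? [VNClassificationObservation])?.first {
                self.latestPrediction = top.identifier
            }
        }
        try? VNImageRequestHandler(cvPixelBuffer: frame.capturedImage).perform([request])

        // Drop an ARAnchor slightly in front of the camera at the tap.
        var translation = matrix_identity_float4x4
        translation.columns.3.z = -0.3
        let transform = simd_mul(frame.camera.transform, translation)
        sceneView.session.add(anchor: ARAnchor(transform: transform))
    }

    // ARSKViewDelegate: supply an SKLabelNode for each new anchor.
    func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
        return SKLabelNode(text: latestPrediction)
    }
}
```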

AR world: Core Location outdoors

In the next demo, the screen shows the names of, and directions to, various points of interest on the map.

The steps to build the demo:

  • Create an ARSCNView;
  • Start location tracking;
  • Perform an MKLocalSearch on stock queries;
  • Add an SCNNode for each result.
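These steps might look like the sketch below, under a few assumptions: the query string is arbitrary, and the node placement is schematic only. A real app must convert each result’s geo-coordinate into the AR session’s coordinate space, which is exactly what the ARKit-CoreLocation library in the materials at the end does.

```swift
import ARKit
import MapKit

class PlacesViewController: UIViewController, CLLocationManagerDelegate {
    @IBOutlet var sceneView: ARSCNView!
    let locationManager = CLLocationManager()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Align the AR world's axes with compass headings so that
        // geographic bearings are meaningful in scene space.
        let configuration = ARWorldTrackingConfiguration()
        configuration.worldAlignment = .gravityAndHeading
        sceneView.session.run(configuration)

        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let location = locations.last else { return }

        // A stock query; any MKLocalSearch term works here.
        let request = MKLocalSearch.Request()
        request.naturalLanguageQuery = "coffee"
        request.region = MKCoordinateRegion(center: location.coordinate,
                                            latitudinalMeters: 500,
                                            longitudinalMeters: 500)
        MKLocalSearch(request: request).start { response, _ in
            for item in response?.mapItems ?? [] {
                // One text node per result.
                let text = SCNText(string: item.name, extrusionDepth: 1)
                let node = SCNNode(geometry: text)
                node.scale = SCNVector3(0.01, 0.01, 0.01)
                // Schematic placement: a real app would position the node
                // along the bearing from the user to item.placemark.coordinate.
                node.position = SCNVector3(0, 0, -5)
                self.sceneView.scene.rootNode.addChildNode(node)
            }
        }
        manager.stopUpdatingLocation()
    }
}
```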

Core Location indoors demo

Core Location provides services for determining a device’s geographic location, altitude, orientation, or position relative to a nearby iBeacon. The framework gathers data using all available onboard hardware, including Wi-Fi, GPS, Bluetooth, the magnetometer, the barometer, and cellular radios.

The final demo is a simple AR game in which you interact with objects from the real world.

The tools used to build the demo:

  • beacons;
  • AR Cloud;
  • a proximity manager.
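A proximity manager over beacons can be sketched with Core Location’s beacon ranging. This is an illustrative sketch, not the demo’s code; the UUID below is a placeholder (Estimote’s well-known default), and you would substitute whatever your beacons actually broadcast.

```swift
import CoreLocation

class ProximityManager: NSObject, CLLocationManagerDelegate {
    let locationManager = CLLocationManager()
    // Placeholder UUID: replace with the one your beacons broadcast.
    let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "B9407F30-F5F8-466E-AFF9-25556B57FE6D")!,
        identifier: "game-beacons")

    func start() {
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startRangingBeacons(in: region)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        // Ranged beacons arrive sorted by accuracy; react to the nearest.
        guard let nearest = beacons.first else { return }
        switch nearest.proximity {
        case .immediate, .near:
            // Hook for the game: the player is close enough to interact
            // with the real-world object this beacon is attached to.
            print("Near beacon \(nearest.minor) — trigger interaction")
        default:
            break
        }
    }
}
```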

AR usage examples

Some ideas for AR projects:

  • AR-guided airport: navigation inside the terminal, plus information about gate changes and delays.
  • Luggage trackers: get AR directions to your luggage.
  • Commercial navigation.

MBLT DEV 2018

You’ll have a chance to discuss these new features at MBLT DEV on the 28th of September in Moscow. Grab your ticket now: the closer the date, the higher the price!

Useful materials to help you work on an AR project:

https://github.com/likedan/awesome-coreml-models

https://developer.apple.com/arkit/

http://www.madewitharkit.com/

https://github.com/ProjectDent/ARKit-CoreLocation

http://www.estimote.com

https://github.com/alexcurylo


e-Legion

We build apps for businesses and organise events for developers.