AR Magazine, bringing print to life at Domain.

Nikki Vergracht
Published in Tech @ Domain
7 min read · Feb 8, 2019

Intro

Technology is moving fast, and all companies, especially those with a tech focus, must keep innovating. But how can we balance product backlogs that stretch years ahead against the innovation necessary to thrive?

At Domain we divide our time according to a 60/30/10 rule: 60% new development, 30% core or architecture work, and 10% innovation. As part of that 10%, we organise company-wide events themed around innovation a few times a year, during what we call “Innovation Day”. The name might be a bit misleading, though, as the last few innovation days each ran for three to four days!

The prize announcement during the last innovation day.

An innovation day project can be anything from a pitch to a working prototype, and it doesn’t necessarily have anything to do with tech. On the final day your team presents its findings and tries to impress the rest of the company through a series of presentations. Of course, there are prizes to be won.

Don’t be mistaken though: every single idea or prototype is valuable to the company, and looking back at previous Innovation Days, many prototypes and ideas have made their way into our final products!

Our story

Last December, our little squad came up with the idea to build an immersive magazine. We got our inspiration from a post on Twitter showing an issue of National Geographic with an Augmented Reality cover, and thought it would be worth investigating the possibilities for Domain Group.

Initially we decided to build a prototype using Apple’s ARKit framework that displays static content and videos in Augmented Reality, hovering over a few properties in the latest issue of our housing and lifestyle magazine.

The team and I had very little crossover time, since I was overseas visiting family and friends and working from a different time zone. It was enough, though, to come up with a first working prototype after just one day. The prototype was very simple, showing a static image of our property gallery view in AR, integrated in the app.

In the image below you can see the static gallery image floating on top of the magazine: in this case, a photo of the magazine next to the Daily Prophet from Harry Potter, another of our initial inspirations.

Iteration #1 — static image hovering above the page

Having set up the project so we could each work on separate parts of the interactive experience, we set off to create several types of AR content.

We played around with the following content ideas:

  • Video — used to introduce our team in the presentations, and to show ads and property walkthroughs
  • 3D models — imagine playing around with a 3D floor plan of a property, now how cool is that?
  • Last but not least, the property — the content we serve in the app

On the left-hand side of the photo below, you can see part of the content we currently show for a property in the app. With the help of AR, we wanted to offer this content to people reading the magazine.

On the left the property details page on our app, on the right the same property in the magazine.

While the rest of the team was off creating content, expanding the prototype and sleeping (it was night on their side of the earth), I started investigating the possibility of adding interactive content to the prototype. It didn’t take long to figure out that we could reuse pretty much everything we already had in the app without much trouble.

And off we went…

After just a few hours of coding, we had built an interactive gallery, an interactive map and street view, an enquiry view with a way to call an agent, and a view that displayed local schools around the property. All of this from inside the AR experience.

Here’s the result:

While I was sleeping, and after another few hours of coding, the team added an interactive magazine cover, squad introduction videos and a 3D house model. As a final touch we even included the latest Domain TV commercial to win over those marketing hearts during the presentations.

By the end of the day, all the possibilities we had explored were available in the prototype, with little to no time left to prepare a presentation.

But we figured: why invest time in a presentation when we could just live demo the whole thing? Right?

So that’s what we did. And I’m damn proud of what we achieved.

The second part of this post walks you through the technical know-how behind what we built. If you’re not interested in that part, scroll down to the credits to find out who was involved in this awesome prototype!

Technical walkthrough

The first and hardest part was figuring out how the tracking and the maths worked. What follows assumes you have a reasonable working knowledge of iOS development in Swift.

Apple’s ARKit makes use of what it calls reference images (ARReferenceImage). Reference images are added to a configuration (ARImageTrackingConfiguration), which is then used to run the session of an AR scene view (ARSCNView) that has been set up with a SceneKit scene (SCNScene) and with its delegate set to the view controller.

When a user moves their phone around and the device detects a reference image, a delegate method is called with an anchor that we can use to display new nodes.

Setup of the scene view
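The original snippets were embedded in the post; below is a minimal sketch of that setup, assuming a storyboard-connected ARSCNView. The class name MagazineViewController is ours for illustration, and the three sketches in this section build on each other.

```swift
import ARKit
import SceneKit
import UIKit

final class MagazineViewController: UIViewController {

    @IBOutlet private weak var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Back the AR view with an (initially empty) SceneKit scene and
        // receive ARKit callbacks. The ARSCNViewDelegate conformance is
        // added in the delegate sketch further down.
        sceneView.scene = SCNScene()
        sceneView.delegate = self
    }
}
```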

Setup of the configuration and reading the reference images
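Again as a sketch: the reference images live in an asset catalog group ("AR Resources" is Xcode's default group name, yours may differ), get loaded into an ARImageTrackingConfiguration, and the session runs when the view appears.

```swift
// Still inside MagazineViewController:
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Load the reference images from the asset catalog.
    guard let referenceImages = ARReferenceImage.referenceImages(
        inGroupNamed: "AR Resources",
        bundle: nil
    ) else {
        fatalError("Missing expected asset catalog resources.")
    }

    // Track the magazine pages as images in the real world.
    let configuration = ARImageTrackingConfiguration()
    configuration.trackingImages = referenceImages
    sceneView.session.run(configuration)
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    sceneView.session.pause()
}
```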

Managing the delegate (the simple version)
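The simple version of the delegate might look like the following; it implements the steps described in the next two paragraphs (the "gallery" image name is a stand-in for the real asset).

```swift
extension MagazineViewController: ARSCNViewDelegate {

    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        let node = SCNNode()

        // We only care about detected reference images.
        guard let imageAnchor = anchor as? ARImageAnchor else { return node }

        // Size the plane from the physical size of the detected page,
        // so the content lines up with the printed magazine.
        let physicalSize = imageAnchor.referenceImage.physicalSize
        let plane = SCNPlane(width: physicalSize.width, height: physicalSize.height)

        // Use the static gallery image as the plane's diffuse contents.
        plane.firstMaterial?.diffuse.contents = UIImage(named: "gallery")

        let planeNode = SCNNode(geometry: plane)
        // SCNPlane is vertical by default; rotate it to lie flat on the page.
        planeNode.eulerAngles.x = -.pi / 2

        node.addChildNode(planeNode)
        return node
    }
}
```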

Once the delegate gets called, we create a new scene node (SCNNode) that we’ll use to display our static gallery image. We check that the anchor is an image anchor and create a new plane (SCNPlane) with a given size. Note that we use the physical size of the reference image here, since we otherwise don’t know how big the plane has to be; we then apply ratios to make our nodes as big as we want them. We set the image as the contents of the plane’s diffuse material and create a new node with the plane as its geometry.

After giving the new node the correct orientation and position, we add it as a child of the node we return to ARKit for display. ARKit deals with the rest of the magic, and our static gallery is nicely shown.

The plane node’s Euler angles set the orientation of the plane in the physical world: three values define the rotation around the three axes shown in the picture below.

Axes system on iOS

The values are set in radians, so we rotate our plane -90º (-π/2) around the x-axis in order to have it flat on the magazine.
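In code, that is a single assignment on the plane node from the delegate sketch above:

```swift
// Euler angles are (x, y, z) rotations in radians.
// -90° (-π/2) around the x-axis lays the vertical plane flat on the page.
planeNode.eulerAngles.x = -Float.pi / 2
```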

If we don’t perform this rotation, we end up with the objects ‘standing up’, like in the image below.

Gallery and enquiry view displayed “standing up”.

That’s obviously not ideal here, but your model or plane might require a different orientation altogether, so keep that in mind when you’re setting up your orientation.

Getting our view controllers and views to work inside AR

To display our actual views and view controllers, the interactive part of our demo, in AR, we needed steps very similar to those for displaying a UIImageView, with one extra note: the views must be added to a container view with a given frame size, or they might not display the way you want them to.

In our case, we just got an empty view when trying to use a view controller’s view as a plane’s diffuse contents. This is because Auto Layout can’t lay anything out without a view that has a frame size, and there is no window it can use to calculate sizes. This one took a while to figure out 😇

The key to getting our view controllers to work was to add each of them inside a container view and use that container as a plane’s diffuse contents to create a node, the same technique we applied for the static images above.
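A sketch of such a helper, assuming the function name and the 400×600-point container size (both are ours, not from the original post):

```swift
// Hypothetical helper: wrap a view controller's view in a container with a
// concrete frame so Auto Layout can resolve sizes, then use the container
// as the plane's diffuse contents, exactly as with the static image.
func makeInteractiveNode(for viewController: UIViewController,
                         physicalSize: CGSize) -> SCNNode {
    let container = UIView(frame: CGRect(x: 0, y: 0, width: 400, height: 600))
    viewController.view.frame = container.bounds
    container.addSubview(viewController.view)

    let plane = SCNPlane(width: physicalSize.width, height: physicalSize.height)
    plane.firstMaterial?.diffuse.contents = container

    let node = SCNNode(geometry: plane)
    node.eulerAngles.x = -.pi / 2
    return node
}
```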

And voilà, ARKit deals with the rest of the magic. The great thing about using view controllers this way is that they already had all the business functionality we required, so we could now call an agent from inside an AR scene. The same logic applied to our map and gallery view controllers.

We had a lot of fun on this journey with AR and I for one can’t wait to see what we’ll be doing with this technology in our company.

Credits

I’d like to thank Domain for enabling us to play with immersive new technologies. It’s a great culture to have and a great environment to work in.

I would also like to use this moment to introduce all the awesome people involved in the project:

Calum Gathergood (Android Developer) https://www.linkedin.com/in/calumgathergood/

Armin Ahmadi (iOS Developer) https://www.linkedin.com/in/arminahmady

Prashant Wosti (Android Developer and Drone Master) https://www.linkedin.com/in/prashantwosti

Amanda Chan (Graphic Designer) https://www.linkedin.com/in/amandajchan

And myself (iOS Developer) https://www.linkedin.com/in/nikkivergracht/

One more thing: if you’re interested in playing around with cool tech like this and want to work with an epic team, keep an eye out for open positions on our LinkedIn page.
