I followed a rabbit to WondARland…

Here is what I discovered about Augmented Reality.

--

Since the release of the iPhone X and iOS 11, Apple's heavy investment has shown a clear intention behind its vision for Augmented Reality (AR). From the moment ARKit was announced in 2017 (opening the technology up to a whopping 1.4 billion devices) to the recent announcement of the 2020 iPad Pro with its LiDAR (Light Detection and Ranging) scanner, Apple has built a rich ecosystem around this developing technology. So rich, in fact, that the scanner in its camera array is advanced enough to be used by NASA for its next Mars landing mission.

But — why so much investment?

One might argue that this investment heavily outweighs the current use of the technology. Of those 1.4 billion people, how many are actually using AR? Don't get me wrong, there are some very interesting use cases currently in the App Store, like "IKEA Place", which lets you put virtual furniture in your room, "Wanna Kicks", which lets you virtually try on sneakers, or Apple's own native Measure app, which lets you measure just about anything using only your device. Nevertheless, it's only a handful of experiences that are available out there…

I have only been a developer for a short while, if I even dare to call myself one. My knowledge of Swift and development is somewhere between basic and advanced. Let's just say that I have spent enough time tinkering with code to say I'm not a complete beginner anymore… As I continue my journey as a developer, I want to gain a deeper understanding of AR and its nuances.

I have a good friend that started learning Swift at the same time as me. We like to joke about the rabbit holes we visit while geeking out on technologies or some new framework we discovered. With the iPad Pro’s LiDAR sensor just released I thought it would be a good moment to take the plunge and try to develop something for this environment.

After reading through the Apple documentation for ARKit and watching some YouTube tutorials, I decided I wanted to try something simple but interactive. I didn't want to feel overwhelmed, so I set myself the challenge of adding a virtual object to a real environment. From some research I learned you can import 3D models into your projects, which you can create yourself using apps such as Shapr3D, Blender or SketchUp. I am by no means an artist… If I drew my own model it would end up being an abstract, unrecognisable Picasso… Not sure anybody would actually recognise that there is a virtual object… I therefore decided to use what was at hand, which in this case was Xcode's own library of presets, also known as Reality Composer, released at WWDC in 2019. Little did I know what you can do with Reality Composer! (But more on that later…)

After looking at the objects available, a soccer ball caught my eye… It looked neat, so I thought: why not, let's try adding this to reality?! I exported the ball as a USDZ file, which seemed to be the type of file used in the tutorials. After some more reading on AR, a couple of tutorials and adapting the code to what I wanted to do, I figured it out! I added a soccer ball on top of my laptop. You cannot imagine the grin on my face…

The first small win!

And to be honest it wasn't really that much code, only about 70 lines in fact.
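If you're curious what that roughly looks like, here is a minimal sketch of the idea rather than my exact project. It assumes an ARSCNView wired up in a storyboard and a hypothetical soccerBall.usdz exported from Reality Composer and added to the app bundle.

```swift
import UIKit
import ARKit
import SceneKit

class ViewController: UIViewController {
    // Assumes an ARSCNView connected in the storyboard.
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Start world tracking and ask ARKit to detect horizontal surfaces.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)
    }

    func placeBall(at position: SCNVector3) {
        // "soccerBall.usdz" is a placeholder name for the model exported from Reality Composer.
        guard let ballScene = SCNScene(named: "soccerBall.usdz"),
              let ballNode = ballScene.rootNode.childNodes.first else { return }
        ballNode.position = position
        sceneView.scene.rootNode.addChildNode(ballNode)
    }
}
```

Call placeBall(at:) with a point on a surface ARKit has detected and the ball appears in your room.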

From here curiosity sparked! What else could I do?
I had added 1 ball… What would happen if I added 30? Can I even add 30?

I’m not joking… I actually tapped 30 soccer balls into my bedroom by simply adding another line of code.
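That extra bit of code was essentially a tap gesture. As a rough sketch building on the snippet above (assumptions again, not my exact code), every tap hit-tests against the surfaces ARKit has already detected and drops another ball at that spot:

```swift
// In viewDidLoad:
// sceneView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(handleTap(_:))))

@objc func handleTap(_ gesture: UITapGestureRecognizer) {
    let location = gesture.location(in: sceneView)
    // Only accept hits on planes ARKit has already found.
    guard let hit = sceneView.hitTest(location, types: .existingPlaneUsingExtent).first else { return }
    let position = hit.worldTransform.columns.3
    placeBall(at: SCNVector3(position.x, position.y, position.z))
}
```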

Is 20 balls enough? No, we need more!

I probably enjoyed it way more than I should have. My girlfriend thought I was crazy when I was walking around the room adding soccer balls, because of course she couldn’t see what I was seeing through my phone. She would just see a dude walking around the place with a phone and a silly grin on his face.

Yet, the initial purpose was to make this interactive… and I wasn’t sold on that front yet. While there were plenty of objects which I could drag around on the screen, they were still just lying there.

Would there be a way to simulate some sort of physics that I could manage to program? Actually, yes! Programmatically, the best framework to use as your 3D engine is SceneKit. Apple has done A LOT with SceneKit from what I have seen so far. It's as simple as creating an object, adding some weight to it plus some other properties and… Boom! The physics engine runs it all for you. Could it get any easier? Well, as it turned out… Yes!
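Here is a sketch of what "adding some weight" looks like in SceneKit, assuming ballNode is the soccer ball node loaded earlier:

```swift
// Give the ball a dynamic physics body so SceneKit's engine moves it for us.
let shape = SCNPhysicsShape(node: ballNode, options: nil)
let body = SCNPhysicsBody(type: .dynamic, shape: shape)
body.mass = 0.45        // roughly a real soccer ball, in kilograms
body.restitution = 0.8  // how much energy it keeps when it bounces
ballNode.physicsBody = body
```

From then on, gravity, collisions and bounces are handled for you.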

After playing around with it for a while, I found out there is actually much more you can do with Reality Composer than just export objects. You can export whole scenes with behaviours attached to notifications, without any code at all. Then, of course, you can trigger these notifications with some code later on if you want… But the craziest thing is, you don't even need to do this on your Mac: you can create scenes directly on your iPad or iPhone using the native Reality Composer app.

Screenshot of Reality Composer iPad app
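Back in Xcode, triggering one of those behaviours from code is only a few lines. Take this as a hedged sketch: when you add a Reality Composer project to Xcode, it generates a Swift class for it, and the names below ("Experience", "BallScene", "bounce", "arView") are placeholders for whatever your project, scene, notification trigger and ARView are actually called.

```swift
import RealityKit

// Load the scene Xcode generated from the Reality Composer project
// and anchor it in the (assumed) ARView.
let ballScene = try! Experience.loadBallScene()
arView.scene.anchors.append(ballScene)

// A behaviour whose trigger is set to "Notification" in Reality Composer
// shows up under `notifications`; posting it runs the behaviour.
ballScene.notifications.bounce.post()
```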

From here the fun continued! I attached the soccer ball to my head and made it stick there!

From there I managed to make it bounce while getting an applause effect for my hard work. Yes, YOU too could do this in 5 minutes after understanding the basics of the app.

And the crowd goes wild!

So… What did I get out of this and what do I want you to get out of this?

Anybody that wants to contribute or explore the AR environment can easily do so. It really isn’t that complicated and I can only encourage you, because it is ridiculously fun!

Are there any problems that I see with the AR experience? From this small adventure, I would say the main constraint that comes to mind is one of design. There is a famous industrial designer from Germany called Dieter Rams. You might not have heard of him, but you've definitely seen his work; his designs have hugely inspired Apple's own designs for years. He once said, "good design is making something intelligible and memorable; great design is making something memorable and meaningful."

ARKit, SceneKit and Reality Composer provide not only developers but also users with powerful Augmented Reality tools that make it easy enough to create memorable experiences. Making them meaningful is often the most difficult part. The most significant moments come to us through shared experiences, which is something I have yet to experience in AR.

That's not to say augmented reality apps can't just be fun; there is always room for some wonky app exploration. But the challenge for the technology now lies in finding ways of connecting users that are immersive, because reality, after all, is a shared experience.

At the moment, the barrier to immersion is that we are seeing all of this through the slabs of glass that are our phones. Some sort of AR glasses could easily solve this problem, and reports suggest there are products in the works, but it's still an open question. With more consumer awareness and, at some point in the future, more immersive shared experiences, I will better understand why Apple and other companies are investing so much without an immediate return.

So where to from here? I will continue working on making the soccer ball experience more interactive. How might I create this shared experience? Using multiple devices? Developing something to immerse me and my girlfriend in the experience? I have seen it's possible, and that's all I need to know to keep going. All I can say is that I'll be spending some more time in this rabbit hole; I'm loving WondARland. I suggest you come visit too.
