A concept for how Apple can push Augmented Reality and Siri together in iOS 12

Albert Choi
7 min read · Sep 29, 2017


It begins with a new app and developers kit.

Craig Federighi Presenting ARKit at WWDC

Apple and Augmented Reality

In June, Apple released ARKit along with iOS 11 at WWDC. This was Apple’s first investment in augmented reality technology, and it gave developers tools to build AR-based apps. What surprised me was that Apple provided tools to build AR apps but didn’t release any of its own. You could make the case that they put developers first; however, I believe it’s Apple being cautious about entering the AR market. Augmented reality is new to consumers, but there is unlimited potential. It’s unclear what Apple’s next move is, but my best guess is that in iOS 12 they’ll make a full investment in augmented reality through their own native apps. Because if anyone can successfully pull it off, it’s Apple.

I’m very interested in the AR space. Thinking like a Product Designer at Apple, I wanted to build a concept that gives developers the opportunity to showcase their AR apps to consumers with ease while also integrating Apple’s own native apps. I also thought Siri and AR are a perfect complement, so here’s what I came up with:

Concept of Lens App & LensKit

Lens is the app that would ship with iOS 12. It would be the foundation and platform for Apple’s native apps and third-party apps that use augmented reality. Lens is an amazing platform for developers to showcase their AR-based apps without users needing to download the full app.

SiriKit and ARKit are official developer tools made by Apple

LensKit is the developer tool used to build on the Lens app. LensKit will help developers create a new experience or merge an existing app into Lens. Developers will have full control over which features are available through Lens. To take full advantage of Lens, developers will need to work with SiriKit and ARKit.
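
Since Lens and LensKit only exist as a concept, there’s no real API for them, but here’s a rough Swift sketch of what a LensKit entry point could look like, built on top of the real ARKit and Intents (SiriKit) frameworks. Every LensKit name below is something I made up for this concept.

```swift
import ARKit
import Intents

// Hypothetical LensKit entry point: a "Lens experience" bundles an AR scene
// with the Siri intents it can respond to inside Lens. LensExperience and its
// members are invented names for this concept; the ARKit and Intents types
// are real Apple frameworks.
protocol LensExperience {
    /// The AR configuration Lens should run while this experience is active.
    var configuration: ARWorldTrackingConfiguration { get }
    /// The SiriKit intents this experience can handle inside Lens.
    var supportedIntents: [INIntent.Type] { get }
    /// Called when Lens routes a resolved Siri intent to the experience.
    func handle(_ intent: INIntent, in view: ARSCNView)
}

// A minimal experience that just turns on horizontal plane detection.
struct TableDemoExperience: LensExperience {
    var configuration: ARWorldTrackingConfiguration {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal
        return config
    }

    var supportedIntents: [INIntent.Type] { return [] }

    func handle(_ intent: INIntent, in view: ARSCNView) {
        // Place or update AR content in response to the Siri command.
    }
}
```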

Lens App

The app will come preloaded with the newest iOS and will be available on the home screen.

You can also use Siri to open Lens with certain features already active.

When you first open the app, you’ll be greeted by a quick introduction.

Once you’re in the app, the camera will activate and a menu bar will appear. There will be three buttons: Explore, Siri, and My Apps.

Explore will show you the top Lens apps and let you preview them. You can quickly preview and demo an app, then download it.

Siri works the same way as always. Use your voice to navigate Lens and issue commands.

My Apps will show you all the apps that you’ve loaded onto Lens. If you already have a compatible app on your iPhone, it will automatically migrate to work with Lens.

Maps

When you activate Siri and ask a question like “Where are some places to eat?”, Lens will use Maps to help you look for a location. Using augmented reality, restaurant icons are shown at each location. If you prefer the Maps app, it can be accessed by tapping the icon in the top right of your screen.

When you tap any of the restaurant icons, you’ll see information about that restaurant. With Lens, you’ll be able to receive AR-based directions while walking or while driving (for passengers). Maps will also be able to provide compatible indoor routes.
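
Under the hood, the search half of this could lean on MapKit. Here’s a hedged sketch: MKLocalSearch is a real MapKit API, but geoToWorldPosition(_:) is a made-up helper, since projecting GPS coordinates into ARKit’s world space takes location and heading math that I’m leaving out.

```swift
import ARKit
import MapKit
import SceneKit

// Rough sketch of the Maps flow: search for restaurants near the user with
// MKLocalSearch (real MapKit API) and drop a marker node for each result.
func showNearbyRestaurants(in sceneView: ARSCNView, around region: MKCoordinateRegion) {
    let request = MKLocalSearchRequest()
    request.naturalLanguageQuery = "restaurants"
    request.region = region

    MKLocalSearch(request: request).start { response, _ in
        guard let mapItems = response?.mapItems else { return }
        for item in mapItems {
            let marker = SCNNode(geometry: SCNSphere(radius: 0.05))
            marker.name = item.name
            // Hypothetical helper: convert the restaurant's GPS coordinate
            // into a position in the AR session's world coordinate system.
            marker.position = geoToWorldPosition(item.placemark.coordinate)
            sceneView.scene.rootNode.addChildNode(marker)
        }
    }
}

// Placeholder conversion. A real implementation would use the device's
// location and compass heading to project the coordinate into AR space.
func geoToWorldPosition(_ coordinate: CLLocationCoordinate2D) -> SCNVector3 {
    return SCNVector3(x: 0, y: 0, z: -1)   // stub: one meter in front of the camera
}
```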

Third-Party Apps

Developers will have the option to either build a preview of their app on Lens or fully integrate it with Lens. Users will have the freedom to try an app before they decide to download it.

IKEA Place App

I decided to use IKEA Place as an example of a preview app. The preview will give users a glimpse of the app.

When opening the app for the first time on Lens, you will be greeted with a short description of the app and instructions.

The objective is to place a Lack table in your home. You will be able to use your finger or Siri to place the table. In this case, we’ll tell Siri to place it for us.
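
For the placement itself, ARKit’s plane detection and hit testing already cover the tap-to-place case, and a Siri command could simply call the same function. Here’s a minimal sketch, assuming a model file named lack_table.scn (a name I made up):

```swift
import ARKit
import SceneKit

// Places the table on a detected horizontal plane. The hit test against
// detected planes is real ARKit API; "lack_table.scn" is a made-up asset
// name. In the Lens concept, a Siri command could call this same function
// instead of a tap.
func placeTable(at screenPoint: CGPoint, in sceneView: ARSCNView) {
    guard let hit = sceneView.hitTest(screenPoint, types: .existingPlaneUsingExtent).first,
          let tableScene = SCNScene(named: "lack_table.scn"),
          let tableNode = tableScene.rootNode.childNodes.first else { return }

    // Use the hit result's transform to sit the model on the plane.
    tableNode.simdTransform = hit.worldTransform
    sceneView.scene.rootNode.addChildNode(tableNode)
}
```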

Once the table is placed, you’ll have several controls on the display. Starting with the bottom right corner is the IKEA Place app button: tap it to download the app from the App Store, or, if you already have it installed, to open the app. The bottom left is the help button for finding support or information about the app. The top left is the redo button for redoing the placement, customization, or adjustments. The top right is an option to change the color of the table. The top middle features a screenshot button to save to your camera roll. The next two slides show the color options and a change in table color.

Night Sky App

Another example is using the Night Sky app — an app that shows you a map of space!

You will be able to tell Siri to open the app and give it a task. Once you’re finished, it will give you the option to download the full app from the App Store.
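
SiriKit’s current domains don’t cover a generic “show me the sky” request, so one plausible bridge is NSUserActivity, which already lets an app describe its current task to Siri suggestions and Spotlight. A small sketch, with the activity type and strings made up for this example:

```swift
import UIKit

// Sketch of how a previewed app could describe its current task to the
// system with NSUserActivity, so Siri suggestions, Spotlight, or Handoff
// could route the user back into the full app. The activity type string and
// title are placeholders invented for this Night Sky example.
func donateStargazingActivity(from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.nightsky.view-sky")
    activity.title = "View tonight's sky"
    activity.isEligibleForSearch = true   // lets Spotlight and Siri suggestions surface it
    activity.userInfo = ["constellation": "Orion"]

    // Attaching the activity to a visible view controller and making it
    // current is what registers it with the system.
    viewController.userActivity = activity
    activity.becomeCurrent()
}
```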

Lastly, Pokémon GO

I had to, based on my previous design posts. With Lens, you will be able to ask Siri for assistance in finding nearby Pokémon.

Looks like a wild Squirtle appeared! Using Siri, you’ll be able to tell it to catch the Pokémon.

Once Squirtle is caught, you will have the option to open the app.

Conclusion

I believe this would increase engagement with AR apps. Augmented reality has unlimited potential but is still extremely new to most consumers. Providing users with a demo or preview of what AR-based apps can do would make for a great experience. Siri would become more important as AR evolves. Touching the screen shouldn’t compromise the use of AR apps, and that’s why I believe voice commands are crucial. I’m very interested to see what Apple does next with augmented reality.

Thank you for your time,

Albert
