How to Prototype and Test Vision Pro AR Apps on iPhone Without Code

Amir Khella · Published in Augmentop · 9 min read · Jul 9, 2024

This article originally appeared on Augmentop. You can read the full version here.

How long would it take to go from an idea for an Augmented Reality app to a prototype that you can share with your team members or clients, or test with your target audience?

A month? A week?

How about 5 hours?

The video below shows an interactive AR prototype for a Vision Pro app running natively on iOS (no AR headset required). It was created in less than a day using freely available tools, and without writing a single line of code. The prototype does not require any app downloads to run, and can be shared with others using any cloud service.

Vision Pro app prototype running natively on iOS

If you have an iPhone or an iPad, you can test the prototype in AR mode here. No app download or installation is required.

In the next few minutes, I am sharing my step-by-step process for creating that prototype, as well as the project/source file that you can use as a starting point for your own prototype.

But first, let’s take a few seconds to examine the current AR and spatial computing landscape.

The Current Spatial Computing and Augmented Reality Landscape

With the Vision Pro, Apple has ushered in a new age of spatial computing for consumers, and creating engaging augmented reality applications for these devices will quickly become a competitive advantage across industries.

However, creating AR apps still requires specialized design skills and programming experience, an extensive amount of development time, and an expensive headset to preview and test them.

To save time and resources, you need to be able to:

  1. Prototype AR apps quickly and cheaply
  2. Test them with users early and often, without needing an AR headset
  3. Use tools that you are already familiar with, or that are easy to learn, instead of spending weeks learning complex tools and programming languages

There are currently 3 options for designing and prototyping spatial interfaces and augmented reality applications:

  1. Using a UI design tool like Figma, which is an advantage for people who already know it, but those prototypes can only be previewed on a 2D screen, instead of in your actual environment (AR)
  2. Using Apple’s development tools, such as Xcode and Reality Composer Pro, but those prototypes can only be previewed on a Vision Pro
  3. Using 3D/game creation tools like Unity and Unreal Engine, which are not only complex, but also require special deployment techniques to test the prototype on target devices or phones

OUR SOLUTION

We have been using a new technique for the past couple of years to create Augmented Reality business cards, and QR-code-driven AR experiences for books, magazines and print marketing.

Augmented Reality business card

This technique allows us to prototype interactive AR apps in less than a day without writing code, and to test them natively on any iOS/iPadOS device without having to download any apps. That has been very useful for sharing those prototypes with our clients and for testing them with end users.

We have created step-by-step tutorials for how to create those experiences here.

This workflow has been used by thousands of designers, developers, entrepreneurs, founders of AR startups, and product managers at large companies extending their solutions to augmented reality. It is similar to the workflow I used to prototype web and mobile apps with Keynote more than a decade ago.

Today, I am showing how we have extended this workflow to prototype various types of Vision Pro apps in the same amount of time, and sharing all the steps involved in that process, as well as a sample project file to use as a starting point for your prototypes.

The only tools you need are:

  1. Figma (Free)
  2. Reality Composer (Free)

Important Note:

Apple has removed Reality Composer from newer versions of Xcode and replaced it with Reality Composer Pro, which requires Xcode to create AR prototypes and a Vision Pro headset to test them.

The original Reality Composer is still available for free on iOS and iPadOS, and it works great there.

If you would still like to use Reality Composer on macOS, follow these steps:

1. Log in to your Apple Developer account, or create a new one
2. Navigate to the downloads section, locate Xcode 15, and download it
3. Install Xcode
4. Newer macOS versions will not let you run Xcode 15 directly. Instead, right-click Xcode, choose “Show Package Contents” from the context menu, navigate to Contents/Applications, and double-click Reality Composer to open it.

Reality Composer is like a 3D version of Apple Keynote, and we hope Apple brings it back through the Mac App Store, as it is an extremely valuable tool for prototyping spatial apps and creating quick interactive augmented reality experiences without code.

Here is our step-by-step process for prototyping AR apps and spatial interfaces using these tools.

If you prefer watching instead of reading, I will be adding a full video tutorial here shortly. Follow me on Medium, X/Twitter or LinkedIn to get notified when the video is available.

Step 1: Designing the UI in Figma

Using Apple’s Vision Pro UI kit for Figma, create different UI screens for your app.

Here are some sample screens that we have created for the social media AR prototype demoed in this video:

Login screen
Feed screen
Profile screen
Post screen
Compose screen

Step 2: Exporting the UI from Figma

The next step is to export your UI from Figma in a structure and format that we can use in Reality Composer.

To do so, you need to select each high-level UI screen and component, and export it individually as a PNG with transparency. This includes individual UI screens, as well as any toolbars, tab bars, popup dialogs, etc.

If a UI component has multiple states (e.g. expanded and collapsed), you need a separate export for each state.

Exporting spatial UI screens from Figma
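
If you only have a handful of screens, manual export is fine. For larger prototypes, the Figma REST API can batch-export nodes as PNGs. Below is a minimal Swift sketch of that idea; the file key, node IDs, and access token are placeholders you would replace with your own:

```swift
import Foundation

// Minimal sketch: batch-export Figma frames as PNGs via the Figma REST API.
// The file key comes from the Figma file URL; node IDs from right-click → Copy link.
let fileKey = "YOUR_FILE_KEY"               // placeholder
let nodeIDs = ["1:2", "1:3"]                // placeholder frame/component IDs
let token   = "YOUR_PERSONAL_ACCESS_TOKEN"  // Figma account settings → tokens

struct ImagesResponse: Decodable {
    let images: [String: String?]   // node ID → temporary PNG URL (null on failure)
}

func exportFrames() async throws {
    var components = URLComponents(string: "https://api.figma.com/v1/images/\(fileKey)")!
    components.queryItems = [
        .init(name: "ids", value: nodeIDs.joined(separator: ",")),
        .init(name: "format", value: "png"),
        .init(name: "scale", value: "2"),
    ]
    var request = URLRequest(url: components.url!)
    request.setValue(token, forHTTPHeaderField: "X-Figma-Token")

    let (data, _) = try await URLSession.shared.data(for: request)
    let urls = try JSONDecoder().decode(ImagesResponse.self, from: data).images

    // Download each rendered PNG into the current directory.
    for (nodeID, urlString) in urls {
        guard let urlString, let url = URL(string: urlString) else { continue }
        let (png, _) = try await URLSession.shared.data(from: url)
        let filename = nodeID.replacingOccurrences(of: ":", with: "-") + ".png"
        try png.write(to: URL(fileURLWithPath: filename))
        print("Saved \(filename)")
    }
}

try await exportFrames()   // top-level await works in main.swift / scripts
```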

Step 3: Importing the UI into Reality Composer

Next, we need to import those exported UI images into Reality Composer by following these steps:

1. Create a new Reality Composer project

2. Create a new scene for each UI screen (login, feed, post, profile, etc.)

3. Drag and drop the exported UI images onto that scene

4. Adjust the position and rotation of each element to face the camera and to match its original location in the Figma design file. Make sure the UI components are placed high enough relative to the origin, and in similar spots across scenes, so that they don’t jump from one spot to another as users navigate between scenes.

5. Repeat steps 2, 3 and 4 for each UI screen

This is the most repetitive and time-consuming part of the process. If we find or create a faster way to do it, I will update this tutorial accordingly.

If you are using Reality Composer on iOS or iPadOS, you can use AR mode to validate that the UI components are placed and sized properly before moving to the next step.

When you are done importing your spatial UI from Figma into Reality Composer, you should have something that looks like the following screenshot.

Importing UI into Reality Composer

The red circles (also created in Figma and exported as transparent PNGs) are interaction hot spots that we add to the AR prototype to guide clients/users on where to tap to interact with the prototype, and that we use to link the AR scenes together in the next steps.
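
As an aside for readers who eventually outgrow the no-code workflow: the sketch below shows roughly what “drop a PNG into the scene and raise it above the origin” means in RealityKit code. The asset name and dimensions are hypothetical, chosen only to illustrate the placement advice above:

```swift
import UIKit
import RealityKit

// Roughly the code equivalent of dragging a PNG into a Reality Composer scene.
// "FeedScreen" is a hypothetical PNG in the app bundle; sizes are illustrative.
func makeScreenEntity() throws -> ModelEntity {
    let texture = try TextureResource.load(named: "FeedScreen")
    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))

    // A vertical plane that carries the exported UI image.
    let screen = ModelEntity(mesh: .generatePlane(width: 0.9, height: 0.6),
                             materials: [material])

    // Place the UI high enough relative to the anchor origin, and keep the
    // same position across scenes so screens don't jump during navigation.
    screen.position = [0, 1.4, 0]
    return screen
}
```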

Step 4: Adding interactivity

If you are among the 100,000+ people who have been using Keynote to prototype web and mobile apps, this next step will sound familiar!

After creating the spatial UI layouts in Reality Composer, you need to link those scenes together so that tapping a specific hot spot in one scene opens the scene showing how the AR app responds to that input.

To do so, you need to:

  1. Place hot spots where the user would interact with the spatial UI
  2. Add a Tap behavior for each hot spot to change the scene to the next one

Linking between UI screens in Reality Composer

Once all your scenes are linked together, you can test your spatial prototype by clicking the Play button in your project.

If you are using Reality Composer on the iPhone or iPad, you can also test the prototype in AR mode and interact with it in your room.
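
For context, here is roughly what those tap behaviors would cost you in code if Reality Composer didn’t provide them. This RealityKit sketch swaps anchors on a hit-tested tap; the file, scene, and hot spot names (“Prototype.reality”, “Login”, “Feed”, “LoginHotSpot”) are all hypothetical:

```swift
import UIKit
import RealityKit

// A rough sketch of what Reality Composer's tap behaviors do for you.
final class PrototypeViewController: UIViewController {
    let arView = ARView(frame: .zero)
    var loginScene: AnchorEntity?
    var feedScene: AnchorEntity?

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // Load two scenes from the .reality file exported by Reality Composer.
        if let url = Bundle.main.url(forResource: "Prototype", withExtension: "reality") {
            loginScene = try? Entity.loadAnchor(contentsOf: url, withName: "Login")
            feedScene  = try? Entity.loadAnchor(contentsOf: url, withName: "Feed")
        }
        if let login = loginScene { arView.scene.addAnchor(login) }

        arView.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(handleTap)))
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        // Hit-test the tap; entities need collision shapes to be hittable.
        let point = gesture.location(in: arView)
        guard let tapped = arView.entity(at: point),
              tapped.name == "LoginHotSpot",
              let login = loginScene, let feed = feedScene else { return }

        // "Change scene": swap the current anchor for the next one.
        arView.scene.removeAnchor(login)
        arView.scene.addAnchor(feed)
    }
}
```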

Step 5: Adding animations

No spatial interface is complete without some beautiful animations to keep users in context as the UI switches from one state/scene to another. This is probably the most fun part of the process, and the stage where Reality Composer really excels.

We will be using the same Behaviors panel in Reality Composer to add two kinds of animations:

1. Scene start animations

2. Tap/scene exit animations

At the start of each scene, add a behavior to hide all scene elements with a 0-second duration, and then add an action to show each UI element with a different animation.

For instance, the example below shows the social media feed screen containing 3 UI elements (tab bar, toolbar, and main screen), animating in from the left, bottom, and rear respectively with a 0.3-second duration.

Adding UI animation in Reality Composer

The second kind of animation runs when a specific UI hot spot is tapped on the screen. In this case, two actions are added:

1. Hide the UI screens with a reverse animation
2. Change the scene to the next one in the interaction sequence

Adding UI animations in Reality Composer

I highly advise testing each animation individually, and then all the animations in a scene together, before moving on to the next scene.

As with the previous steps, you can use AR mode on your iPhone or iPad to test the spatial prototype in your room.
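
If you are curious, the 0.3-second “animate in from the rear” entrance described above maps to a single move(to:) call in RealityKit. A minimal sketch, assuming an entity placed as in Step 3:

```swift
import RealityKit

// The "animate in from the rear, 0.3 s" behavior, expressed in RealityKit.
// `screen` is any entity positioned as in Step 3.
func animateIn(_ screen: Entity) {
    let end = screen.transform
    var start = end
    start.translation.z -= 0.5   // start half a meter behind the final spot

    screen.transform = start
    screen.move(to: end, relativeTo: screen.parent,
                duration: 0.3, timingFunction: .easeOut)
}
```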

Step 6: Exporting, testing on mobile devices, and sharing with others

To share your prototype with others (clients, users, investors, etc.) who may not have Reality Composer installed, you can export it as a Reality file and share it with them via email, messaging or any cloud service of your choice.

The .reality file can be opened natively on iOS or iPadOS version 13.0 or later.

Make sure you export the entire project rather than the current scene only, and as a Reality file rather than a USDZ file, since USDZ currently does not support user interactions.

Exporting Spatial UI prototype from Reality Composer
Testing Vision Pro app prototype on iOS
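
Recipients simply tap the .reality file in Messages, Mail or Files and it opens in AR Quick Look. If you later want to embed the same preview inside your own iOS app, QuickLook’s QLPreviewController handles it in a few lines; a sketch, assuming the exported file is bundled as “Prototype.reality”:

```swift
import UIKit
import QuickLook

// Presents the exported .reality file with AR Quick Look from inside an app.
// "Prototype.reality" stands in for whatever file you exported in this step.
final class PreviewHost: UIViewController, QLPreviewControllerDataSource {
    func presentPrototype() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // NSURL conforms to QLPreviewItem, so a bridged file URL is enough.
        Bundle.main.url(forResource: "Prototype", withExtension: "reality")! as NSURL
    }
}
```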

YOUR TURN

  • Click here to open the prototype created above and interact with it.
  • Click here to download the Reality Composer project file to learn from, modify, and use as a starting point for your spatial interfaces.

WHAT’S NEXT

AR provides a new platform for designing spatial interfaces, with new opportunities, challenges, and design rules. We have used the technique outlined in this tutorial to prototype augmented reality applications for business, education, data visualization, entertainment, finance, and even games.

As we experiment further and discover more tips for designing for spatial computing, I will share everything we learn here. If you would like to get notified when new tutorials, resources, and tips are available, you can subscribe to our newsletter.

Amir Khella

Entrepreneur, product designer, and consultant. Helped 15 startups design+launch (5 acquired). Founder of Keynotopia and Augmentop. 100K+ customers.