A Quick Guide to Designing for Augmented Reality on Mobile (Part 1)
This article is Part 1 of an ongoing series; catch up on Part 2, Part 3 and Part 4.
I have spent the past few years researching and analyzing the future of design tools. While studying how designers build and think, I noticed a trend: designers had a hard time articulating what they wanted to make for AR. It was difficult for them to explain, and even harder to understand, what the possibilities could be. I often heard designers say:
“My boss wants me to make an AR thing, what should I make?”
With so much hype around AR/VR in the past few years, it’s understandable that it’s on everyone’s radar. What is not okay, however, is the approach. There is no ‘AR thing’. There are problems that need solving, and these problems have constraints that may be overcome by thinking spatially. The first step in finding out whether AR is the right medium is identifying the users and their needs.
Do these problems involve immersing the user in real time, assisting them in space or physically engaging them? Are there physical constraints that currently prevent them from being successful? If so, then there’s a good chance that Augmented Reality can add value to the solution.
One of the superpowers of AR is knowledge transfer: compare gravity with black holes. We are more knowledgeable about gravity because we can experience it, whereas a black hole we can only observe.
Having your users experience rather than observe may sharply increase the chances of them understanding and retaining information. It’s what makes AR such a compelling medium for education and training. Not to mention the ability to be free of any physical limitations and restrictions.
AR also has great potential for marketing, since it immerses the user completely in the experience: higher immersion and engagement tend to lead to higher conversion rates. A user is more likely to make a purchasing decision once they have tried out the product themselves.
Affordances and Constraints
Augmented reality is a digital extension of product design. The same principles of thinking apply with a few modifications. Rather than having physical constraints, the user now has technological constraints and affordances.
This means the world no longer binds the user; however, they are still limited by material constraints determined by the technology. A great example is an older generation iPhone without a motion or depth sensor versus a newer model with that technology.
The earlier models cannot capture the depth data necessary for a smooth experience, which is a constraint; the affordance is using camera data to estimate a plane instead. Although nowhere near the accuracy of a dedicated sensor, this is an excellent example of how valuable it is to think beyond current technological capabilities.
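As a toy sketch of the idea (purely illustrative, not how any particular AR framework actually works): "estimating a plane from camera data" can be pictured as fitting a flat surface to sampled 3D feature points with least squares.

```python
import numpy as np

# Illustrative only: given 3D feature points sampled from camera data,
# estimate a flat surface by fitting z = a*x + b*y + c with least squares.
def fit_plane(points: np.ndarray) -> tuple[float, float, float]:
    xy = points[:, :2]
    A = np.column_stack([xy, np.ones(len(points))])  # rows of [x, y, 1]
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    return tuple(coeffs)  # (a, b, c)

# Points sampled (noisily) from a flat floor at z = 0:
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(50, 3))
pts[:, 2] = rng.normal(0, 0.01, size=50)  # near-flat floor
a, b, c = fit_plane(pts)
```

With enough well-spread points, the recovered coefficients stay close to the true plane, which is roughly why camera-only devices can approximate what a depth sensor measures directly.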
Hardware is generally easier to scope and predict than user behaviors. It is vital for designers to explore beyond current technological constraints so they can help lead the technology forward.
Language plays a critical role when defining your experience. The following are examples of some of the more popular content types used within AR.
- Static: Content that is still and lacks movement and interaction
- Animated: Content that moves on a timeline or follows a sequence
- 3D: Content with width, height and depth or data with XYZ coordinates
- Dynamic: Adaptive content that changes with interaction or over time
- Procedural: Content generated automatically or algorithmically
These content types are not exclusive and can combine in many different ways. However, it is essential to understand these formats so the designer can properly articulate what they are trying to do. For example, consider a design that requires a vase to reveal a price tag when tapped: the vase is a dynamic 3D object that exposes a static tag. If the experience then involves tapping the tag to make a purchase, the tag too becomes dynamic.
When mapping out behaviors and relationships in AR, it is helpful to be specific about where and how to treat the content. Be as precise as possible when describing the experience to get alignment among stakeholders.
A good rule of thumb is to call out the location (e.g., glass, space, object…), the content type (e.g., static, 3D…) and the state of the content (e.g., fixed, locked, flexible…).
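As a loose illustration, this location / content type / state vocabulary can be modeled as plain data. The class and example objects below are hypothetical, invented for this sketch, not part of any AR framework:

```python
from dataclasses import dataclass

@dataclass
class ARContent:
    """Hypothetical spec record using the rule-of-thumb vocabulary:
    location (glass, space, object), content type (static, animated,
    3D, dynamic, procedural) and state (fixed, locked, flexible...)."""
    name: str
    location: str
    content_type: str
    state: str

    def describe(self) -> str:
        # Produce a spec-style label like "STATIC & FIXED ON GLASS"
        prep = "IN" if self.location == "space" else "ON"
        return (f"{self.content_type.upper()} & {self.state.upper()} "
                f"{prep} {self.location.upper()}")

# The vase example: a dynamic 3D object placed freely in space,
# alongside a menu pinned to the screen.
vase = ARContent("vase", "space", "dynamic 3d", "flexible")
menu = ARContent("menu", "glass", "static", "fixed")

print(vase.describe())  # DYNAMIC 3D & FLEXIBLE IN SPACE
print(menu.describe())  # STATIC & FIXED ON GLASS
```

Writing specs in this consistent shape is one way to get the stakeholder alignment described above; the headings that follow use the same labels.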
STATIC & FIXED ON GLASS
This interaction has a static graphic overlay fixed to the glass (screen) at all times. This design convention is useful for permanent elements that need to stay within the user’s reach, such as a menu or a return prompt.
STATIC & LOCKED IN SPACE
Although these elements are locked in space, they can have a dynamic feature where they always face the user. This design convention is useful for labels and material that needs to accompany an object or marker in space.
DYNAMIC & FLEXIBLE ON GLASS
In this case, static content becomes dynamic. This convention works well for allowing users to position assets in custom or specific areas, which is helpful for target-based or drag-and-drop elements.
DYNAMIC 3D & FLEXIBLE IN SPACE
A great way to engage with 3D models and understand their components. Most commonly used for educational purposes, such as breaking an object down into its parts.
DYNAMIC 3D & PROPORTIONATE IN SPACE
Helpful when allowing a user to see an object in an actual environment with lighting and measurement considerations. Often used in commerce platforms.
In Part 2, I discuss specific design patterns and considerations for rotation, position and translation. Thank you for reading!
Interested in learning more about how things work? Read the Designers Guide to Hardware and Software for AR.
Special thanks to Stefanie Hutka and Brendan Ford for their review and insight.