Designing effective instructional UI for complex AR tasks.
Over the last three months at HEXR, one of our goals has been to reduce the strain on customer service caused by the growing number of people using our iOS fitting apps. We knew from previous discussions that many issues arose when people attempted to use our AR feature to send us head-size data, so we began to investigate what was going on.
Using our in-app analytics, we confirmed that ~45% of people attempting an in-app fitting were abandoning it at some point. We dug further into the analytics to uncover where in the customer journey these issues were occurring. Our event-based metrics showed that the most common abandonment point was within the first few seconds of starting the AR feature, accounting for a startling 65% of drop-offs. Here’s a visualisation of that behaviour flow.
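For readers curious how this kind of drop-off analysis works, here is a minimal sketch of a funnel query over event data. The step names and the sample log are purely illustrative, not our real analytics schema:

```python
# Sketch of a funnel/drop-off analysis over raw analytics events.
# Step names and the sample log below are illustrative assumptions.
from collections import defaultdict

FUNNEL_STEPS = ["open_fitting", "start_ar", "capture_scan", "submit_data"]

# One row per analytics event: (user_id, event_name)
events = [
    ("u1", "open_fitting"), ("u1", "start_ar"),
    ("u2", "open_fitting"), ("u2", "start_ar"),
    ("u2", "capture_scan"), ("u2", "submit_data"),
    ("u3", "open_fitting"),
]

# Which steps did each user reach?
reached = defaultdict(set)
for user, event in events:
    reached[user].add(event)

# Count users surviving each step, in funnel order.
counts = [sum(1 for steps in reached.values() if step in steps)
          for step in FUNNEL_STEPS]

# Report the drop-off between consecutive steps.
for step, n, prev in zip(FUNNEL_STEPS, counts, [counts[0]] + counts[:-1]):
    drop = 0 if prev == 0 else (1 - n / prev) * 100
    print(f"{step:>14}: {n} users ({drop:.0f}% drop-off from previous step)")
```

The same shape of query works in most analytics tools (SQL, Amplitude, Mixpanel); the point is simply that counting users who reach each named step, in order, reveals where the journey leaks.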
Analytics could only take us so far, as we had no insight into our customers’ intentions at the point of loading the feature. To understand why this behaviour was occurring, we conducted a usability study. We recruited people who were midway through the customer journey, offering them a voucher or a partial refund on their order to participate. With eight people willing to help us, we asked them to complete a few tasks in the app using the think-aloud method to better understand their perspective. We then followed up with questions such as:
- Can you follow the in-app instructions to complete a fitting?
- Can you teach me how to put the fitting cap on?
- After you had finished watching & reading the instructions, how did you feel about the task?
- Was there any time where you were unsure what was needed from you?
- When you were unsure what to do, what did you do?
A thematic analysis of the usability data revealed that once the AR feature had launched, people had no recollection of the instructions they had been given just moments before. In the follow-up interviews, some people recalled that there was “too much information at once”, and one participant highlighted that “there were no playback controls for the video instructions, and I missed quite a bit of information because of it.” The word most commonly used to describe the experience was “rushed”.
Some other common themes were:
- No indication of whether further instructions followed after pressing ‘next’.
- Unfamiliar terminology that isn’t introduced anywhere.
- The inability to rewatch instructions caused abandonment.
After some expert analysis of the feature, it was clear that the existing user flow could easily cause information-retention issues. The video gave users approximately nine separate instructions, far exceeding the general 7±2 guideline for working memory. There was also a clear lack of progress indicators and, therefore, no visibility of the feature’s state.
We presented these findings to the team in a story format using our personas and journey maps derived from the usability observations. After we’d explained the problem, I led a design workshop to collaboratively create possible solutions. This followed the Lean UX design studio format of:
- problem definition & constraints
- individual idea generation
- present & critique
- iterate and refine in pairs
- team idea generation
We spent approximately one morning working like this and had several ideas to further refine and test by lunchtime. These included:
- Break up the instructions and present each one at the point it’s needed.
- Present instructions when customers attempted to abandon the fitting process.
- Show a more detailed progress bar that indicates the steps.
We refined these ideas and repeated the usability test to check whether the new approach was more effective. To get the most comparable data possible, we recruited the same number of people using the same techniques and, where possible, gave the same test tasks.
To reduce the cost of testing the designs, we conducted a Wizard of Oz-style test: we removed the app’s instructions and presented them alongside the app as users carried out each action. This significantly reduced the upfront engineering work required to validate the ideas.
After the usability tests were completed and behaviour metrics were derived from the observations, we could clearly see an improvement: a 45% increase in the number of people completing the fitting task, with 60% now completing it on their first attempt.
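To make the arithmetic behind such a comparison concrete, here is a tiny sketch of a before/after completion-rate calculation. The participant counts are made up for illustration, not our study data:

```python
# Illustrative before/after comparison of task completion across two
# usability rounds; the counts below are hypothetical, not study data.
def completion_rate(completed: int, total: int) -> float:
    """Fraction of participants who finished the task."""
    return completed / total

baseline = completion_rate(4, 8)   # round 1: 4 of 8 finished the fitting
redesign = completion_rate(6, 8)   # round 2: 6 of 8 finished the fitting

# Relative (not percentage-point) change, which is what a claim like
# "a 45% increase in completions" usually refers to.
pct_increase = (redesign - baseline) / baseline * 100
print(f"Completion: {baseline:.0%} -> {redesign:.0%} "
      f"({pct_increase:+.0f}% relative change)")
```

With only eight participants per round, numbers like these are directional evidence at best, which is why we treated them as a signal to proceed rather than a definitive result.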
With this evidence, we felt comfortable refining these design ideas further. We developed them into polished designs ready to hand off to the engineers, including state variations for errors, alternative viewing methods, and redline diagrams to illustrate sizing and margins.
As this update is still in development, we can’t yet measure the tangible benefit to the organisation. We expect the impact to be less pronounced in the analytics than in the usability data because users’ intentions when opening this feature are unclear; for example, some people like to explore the app and its features before buying in order to understand and evaluate the product.
Overall, this project took approximately 1.5 months of work, with most of the time spent on research. That highlights our most significant challenge in scaling our design strategy at HEXR: making research quicker to schedule, recruit for, and analyse.