Re-imagining Price Check with Augmented Reality
Walmart store shoppers love using our mobile app barcode scanner as a price checker. Our team sees the potential of this product as so much more, though. When a customer launches the scanner, their screen and camera lens create a direct connection between the digital and the physical world.
That’s the magic of augmented reality. Augmenting the real world with useful information is an incredibly powerful communication medium because people instinctively just “get it.” There’s very little learning curve.
Last spring, our team won an internal Walmart hackathon thanks to our compelling explorations using Apple’s ARKit to help customers save time and money. We were so excited about what we’d learned, we challenged ourselves to see if we could use this technology to improve upon our most frequent and relied-upon use case: scanning barcodes to look up products.
After several ideation sessions, we gravitated to an idea we believed could address three customer problems we hadn’t been able to solve until now:
- How might we make the scanning experience not only load products faster, but also feel faster?
- What would happen if we made it easy to compare products by more than just price?
- If we make scanning more fun, will customers be more inclined to use it?
The Perception of Fast
We had a hypothesis that a significant amount of load time after scanning an item actually wasn’t from typical performance focus areas such as network latency and JSON deserialization.
When a user scanned a product, we took them out of the experience and pushed a new UIViewController onto the navigation stack. We suspected that the animation time of this transition, coupled with a subsequent network call to load related products, could be a major bottleneck.
We tested this hypothesis with debug tracing and found that by eliminating this transition, we could reduce the average time before a user could scan another item by 50%.
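The idea can be sketched in a few lines of Swift. This is a minimal, hypothetical model (these type and method names are illustrative, not the shipped code): instead of pushing a new view controller for every scan, the scanner stays on screen and only an overlay tile changes state, so there is no navigation transition to wait out between scans.

```swift
// Hypothetical sketch: keep the camera view alive and update a result
// tile in place, rather than pushing a new UIViewController per scan.

struct Product {
    let barcode: String
    let name: String
}

/// State for a scanner screen that never leaves the camera view.
enum ScannerState {
    case scanning                 // camera live, no result shown
    case showingResult(Product)   // camera still live, tile overlaid
}

final class ScannerModel {
    private(set) var state: ScannerState = .scanning

    /// Called when the barcode reader decodes a product. The camera keeps
    /// running; only the overlay tile changes, so the user can scan the
    /// next item immediately with no transition animation in the way.
    func didScan(_ product: Product) {
        state = .showingResult(product)
    }

    /// Called when the user dismisses the tile; we return to plain
    /// scanning without popping a view controller.
    func resultDismissed() {
        state = .scanning
    }
}
```

Because the view hierarchy never changes, the cost between scans is just the tile update, not a push/pop animation plus a fresh screen load.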
But First, A Design Reboot
One disadvantage we needed to overcome was that the design needed a complete overhaul. The legacy experience was marred by excessive chrome. The reticle was a small rectangle of negative space serving as a small window into reality.
We knew that if we wanted to create an experience that felt fast, we needed to make the content the focal point and strip away any other visual elements that created cognitive strain.
The result is an interface that extends to the edges of the display encouraging you to focus on what you’re scanning in the real world and less on digital ornamentation.
When you’re shopping at a leading low-price retailer with an immense product selection, it can be challenging to decide which product to buy from a category when the prices are very similar.
Imagine you’re on a quick shopping trip because you’re out of diapers and you need to make a split-second decision. You need to pick high-quality diapers for your child in the car. Your personal well-being could depend on it! You could buy the diapers that are the least expensive, but how do you know they are of acceptable quality? If you spent just a little bit more money, could you get diapers of significantly higher quality?
We designed the AR Scanner to anchor dots to what you’ve scanned in reality. As you pan your phone between products you’ve already scanned, the bottom product tile updates based on which anchor you’re pointing at.
The result is something that feels like an extension of an interaction you already do today. As you pan your phone around looking at a row of products, the product at the center of your gaze is updated below, giving you a way to very quickly compare by price and product reviews.
You really have to see it to believe it. Hint: Watch the video above.
This approach also solved another user interaction problem that is common amongst AR interfaces: clutter. By using smaller dots as opposed to anchoring the entire content to the product itself, we don’t have to worry about content occlusion when multiple products are scanned close together.
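The “pan to compare” behavior boils down to a simple geometric question: of all the scanned-product anchors projected into screen space, which one is currently nearest the center of the display? A minimal sketch of that selection, with a hypothetical helper (in the real app, the projected points would come from ARKit projecting each anchor’s position through the camera):

```swift
// Hypothetical sketch of anchor selection for the product tile: pick the
// scanned-product dot whose projected screen position is closest to the
// center of the display.

struct ScreenPoint {
    let x: Double
    let y: Double
}

/// Given the projected screen positions of all scanned-product anchors,
/// return the index of the one closest to the screen center, or nil if
/// nothing has been scanned yet.
func anchorIndexNearestCenter(projected: [ScreenPoint],
                              screenWidth: Double,
                              screenHeight: Double) -> Int? {
    let cx = screenWidth / 2
    let cy = screenHeight / 2
    return projected.indices.min(by: { a, b in
        // Compare squared distances to the screen center; no need for sqrt.
        let da = (projected[a].x - cx) * (projected[a].x - cx)
               + (projected[a].y - cy) * (projected[a].y - cy)
        let db = (projected[b].x - cx) * (projected[b].x - cx)
               + (projected[b].y - cy) * (projected[b].y - cy)
        return da < db
    })
}
```

Running this selection every frame is what makes the bottom tile track your gaze as you pan across a shelf: whichever dot drifts closest to center wins.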
We designed our AR Scanner as an opt-in alternative to how we’ve traditionally approached scanning. At Walmart Labs, we take plenty of risks, but we’re constantly measuring and evaluating data to ensure our risks are controlled and calculated.
In this case, we’re starting with iOS first. Since ARKit can only run on iPhone 6s and newer, and we needed to use some features in ARKit 1.5 which requires at least iOS 11.3, we’re naturally starting with a limited audience.
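That audience gate is easy to express as a predicate. The sketch below uses a hypothetical helper name; on device, the hardware half of the check is what `ARWorldTrackingConfiguration.isSupported` reports (it is false on hardware older than the iPhone 6s), and the OS half corresponds to the iOS 11.3 floor that ARKit 1.5 requires:

```swift
// Hypothetical gating helper: offer the AR Scanner only on ARKit-capable
// hardware (iPhone 6s or newer) running iOS 11.3+, the floor for ARKit 1.5.

func canOfferARScanner(deviceSupportsARKit: Bool,
                       osMajor: Int, osMinor: Int) -> Bool {
    // ARKit 1.5 ships with iOS 11.3, so 11.3 is the minimum OS version.
    let osSupportsARKit15 = osMajor > 11 || (osMajor == 11 && osMinor >= 3)
    return deviceSupportsARKit && osSupportsARKit15
}
```

Devices that fail either check simply keep the traditional scanner, which is why the AR Scanner is opt-in rather than a replacement.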
If you’d like to take the AR Scanner for a spin, download the latest version of the Walmart iOS app (version 18.20 and above) and open the scanner by tapping the barcode icon in the search bar. Once the scanner has loaded, tap on the AR icon in the navigation bar to start the AR Scanner. It’s the one on the far left that looks like a 3D box.
We have a big vision for the future of scanning with augmented reality, and we’re excited to be taking this first step towards many exciting things to come.