From Research to Roadmap

How the SmartShop team used research to build a customer-centric experience

When I joined the SmartShop team in June 2019 as a Senior Experience Designer, it had been a little over a year since the BBC had run an article about the cutting-edge technology of the Sainsbury’s SmartShop app and its mobile pay journey. “Till-free!” the article’s headline gushed. At first glance, I wasn’t sure what I could contribute: the app had been the subject of a significant amount of research in the years prior, and the SmartShop team were a versatile and progressive bunch.

So…what was the problem? Where to begin with a product that looked so shiny and great from the outside?

A couple of months before I joined, in April 2019, Sainsbury’s made the gutsy move of removing tills completely from their Holborn Circus location to push customers towards downloading the app, though a help desk in the corner could still process transactions. Customers who were unwilling to download the app made a beeline for the lone cashier, creating long queues despite our promise of “skip the queues” with the SmartShop app.

By September 2019, the tills were back in Holborn Circus. While the number of convenience stores offering the technology continued to grow, and participation rates steadily increased, we noticed that customers were using the app once and not returning to it.

SmartShop circa September 2019

With these facts in mind, I started to wonder about things like:

  • Who are our customers? Do they change depending on the environment (convenience versus supermarket)?
  • What’s it really like to shop with an app which requires scanning individual products? How does this work for customers with accessibility needs?
  • Why were customers using it once and giving up?
  • Was our marketing message right? Do slogans like “Skip the queues!”, “Till-free!” and “Scan & bag as you go” work for convenience stores and supermarkets alike?

I turned to my design team for help on how we could start getting answers to these questions. First, we created survey questions to ask customers and our store colleagues about their thoughts on and experience with SmartShop. After a few days of chatting to our user groups in-store, we noticed a pattern in their perceptions of SmartShop, depending on whether they were in a convenience store or a supermarket.

Sainsbury’s local with SmartShop advertising

We found that convenience store customers were lukewarm towards SmartShop and felt there was too much effort involved in using the app (onboarding, registration, etc.). The marketing messaging around skipping queues also didn’t land with them; as they pointed out, they could usually zip through checkouts without waiting too long. Supermarket customers, however, raved about their experience with SmartShop, particularly around using the handset. Yet awareness of the app seemed much lower there based on our observations: of the 30 customers we spoke to in supermarkets, we were often met with blank stares when we mentioned the SmartShop app.

Forming our research plan

We needed to dig deeper and really understand who our customers were, what their perception was of the app and handset, and what were the negatives and positives that they had experienced. Enter Vicky Coughlan, an awesome User Researcher from the Sainsbury’s research team. She proposed an ethnography study of 22 participants over a period of several weeks, where each participant would use the SmartShop app and handset and record their entire shopping cycle through a mobile diary app called Indeemo.

With Vicky concentrating on the ethnography study participants and their feedback, I turned my attention to understanding the core usability of the app and the mobile pay journey. I had collected a few assumptions around the customer journey and wondered about things like:

  • What are users’ expectations of SmartShop, and of the requirement to scan a Nectar card to get started?
  • What’s it like ergonomically holding your phone while scanning a product and also carrying a bag?
  • Would they use it based on how the app is now, and could it be their preferred way of shopping?

To get the answers to my questions I worked alongside Gen Baijan, Principal User Researcher, to flesh out the testing format we would use and the tasks we would have our participants perform. We organised the test to take place over 2 weeks with 10 participants, 3 of whom had accessibility needs. None of our participants had used the SmartShop app before, but all were Sainsbury’s customers as well as Nectar card members. The storyline of the test was:

  1. They were first-time users, so they needed to go through onboarding and register
  2. They had their Nectar card with them
  3. They were going to set up a shopping list and scan the items on their list
  4. Finally, they would scan a QR code to complete their shop

The test itself took place in the Sainsbury’s usability lab, where I set up random produce items around the room on different levels of surfaces so that participants would walk around and gather what they needed. I also supplied them with a bag, since one of the SmartShop selling points is scanning products and then placing them immediately into your personal bag.

What usability testing started to uncover

I love usability testing, and in this case we learned a surprising amount of information that proved invaluable to the current updates and roadmap of the app. A few moments caught me off guard, like when we asked participants to scan their Nectar card to get started: 70% of the participants physically took the Nectar card and pressed it against the phone screen rather than pressing the ‘Scan Nectar barcode’ button to activate their phone camera.

“I’m scanning my card!”

Around this stage of scanning a Nectar card, I asked all of the participants what they thought would happen after they scanned their card. What kind of features did they believe would be offered to them now that we had their Nectar data? Repeatedly, the responses came back as:

  • I guess that means my experience will be personalised…
  • I should be able to see my total points
  • I should be able to see new offers from Nectar in this app
  • I can spend my points, and you will also store my vouchers rather than giving me paper vouchers

As we continued through the various tasks, other patterns began to emerge, such as the awkward switching between viewing their list and viewing their basket. A few participants were afraid to close their list, believing that pressing ‘X’ would delete it entirely.

The ‘X’ within the SmartShop header caused anxiety for our first-time list users

A few of our assumptions were validated by testing: ergonomically, the balancing act of carrying a bag, holding a phone, and handling a product was problematic. Many participants would keep the product on the ‘shelf’ and scan it there rather than wrestling with both it and their phone. For left-handed users, straining their hand to press the right-side FAB button was frustrating.

Since these participants were all first-time users, after going through the process of using the app the majority said they liked it but weren’t sure about using it regularly. Of the 3 participants with accessibility needs, 2 felt using the app was fine. The third participant, however, pointed out how impractical it would be for him to use the app: he relied on a guide dog, which limited his hands to holding a lead and carrying his groceries. Anything beyond this was impossible.

With usability testing completed, what I found was:

  • Onboarding was unclear about how the app worked; several participants became convinced the maximum spend was £17.59 because of an illustration in the onboarding that featured marketing slogans.
  • The connection between Nectar and SmartShop was a tenuous one. We didn’t offer enough of the benefits of Nectar to keep SmartShop users hooked.
  • Setting up their camera and location was met with distrust — why was this needed? We lacked a strong explanation to customers.
  • Scanning was confusing — from scanning their Nectar card to scanning their products, the first few times were a little awkward.

Getting to know our customers

While I was finishing the usability report, Vicky started her analysis of the ethnography study feedback with the help of another researcher. As they combed through the mountain of information they had gathered, Vicky began to point out common problem themes that customers had brought up in the diary study: Scanning products (particularly loose items), the lack of connection between Nectar and SmartShop, and a lack of offers or incentives.

Our second researcher, Mara Protano (Evidenza Ltd), reviewed the analysis and identified several customer mindsets.

Identifying our customer mindsets became the inspiration for the prototypes we created for a second testing session with 12 participants, where we tried ideas that might encourage participants to become habitual users, or at least entice non-users.

Examples of the prototype ideas we tested were:

  • Setting up their dietary profile so that we suggested personalised products
  • Detecting allergens in a scanned product, and alerting the customer not to purchase it
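To make the allergen-alert prototype idea concrete, here is a minimal sketch of the kind of check it implies: compare a scanned product’s allergens against the customer’s dietary profile. All product data, barcodes, and function names here are hypothetical illustrations, not SmartShop’s actual implementation.

```python
# Hypothetical barcode → product lookup; real data would come from a
# product catalogue service, not a hard-coded dictionary.
PRODUCTS = {
    "0123456789012": {"name": "Peanut Butter", "allergens": {"peanuts"}},
    "0987654321098": {"name": "Oat Milk", "allergens": {"oats"}},
}

def check_scan(barcode, customer_allergens):
    """Return a warning message if the scanned product contains an
    allergen from the customer's dietary profile, otherwise None."""
    product = PRODUCTS.get(barcode)
    if product is None:
        return None  # unknown barcode: no alert
    matches = product["allergens"] & customer_allergens
    if matches:
        return f"Warning: {product['name']} contains {', '.join(sorted(matches))}"
    return None

# Usage: a customer with a peanut allergy scans the peanut butter.
print(check_scan("0123456789012", {"peanuts", "gluten"}))
# → Warning: Peanut Butter contains peanuts
```

The set intersection keeps the check simple: the alert fires only when a product allergen overlaps the customer’s profile, so safe products scan silently.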

The most common mindsets appearing in this second session were Efficient and Supporting: these customers were concerned with using their time wisely, getting what they needed in an organised manner, and expecting assistance or suggestions from the app.

Additional questions around our marketing slogans, participants’ understanding of the connection between Nectar and SmartShop, and the rescans process were also raised. The responses we received painted an even wider picture of the entire SmartShop experience while validating what we’d uncovered earlier on.

With all this extra information that Mara helped us uncover, I worked alongside the product team to prioritise what we needed to address in the app to improve our customer experience. We came up with a long list of improvements, and after months of testing, designing and iterating we now have a number of new updates to the app.

What to expect next

Before I get into the fun stuff, I should probably mention that getting to this point has been a huge team effort. The pandemic created a greater sense of urgency to correct the issues we’d been uncovering, particularly as we saw a huge uptake of downloads in March 2020. So, between testing, designing and sharing ideas amongst the team — everyone has been keen to improve the customer experience of SmartShop.

Now for the exciting part! Some of the main points we took away from our research helped us towards the following actions:

  • New navigation so that switching between lists and basket view is much easier (we also centre-aligned the FAB button for you lefties)
  • Redesigning shopping lists to be supportive for our customers who are looking for both organisation and inspiration
  • Nectar offers in the app
Screens with new SmartShop UI

As a team we move quickly, and we’re already testing a new onboarding and registration flow, starting research on the rescans process, and working to understand customer habit formation, among many other things.

I’m so proud of the work our team has done and the amount of change we’ve accomplished in the past year alone. While we’ve been winning awards for our efforts, we have continued to push ourselves to learn more about our customers. Our research has been key to creating a strong product roadmap, and will continue to be the foundation for a better customer experience.

Senior Product Designer @ Sainsbury’s, focusing on SmartShop and in-store experiences 🛒
