Supporting better indoor wayfinding through conversational AI.

Tim Bettridge
Published in Voice Tech Podcast
Aug 7, 2020 · 11 min read

This article takes you through a project by Connected Labs — Connected’s in-house product research group — that explores how we can make grocery shopping faster and easier and safer in the COVID-19 era.

The Problem

As cities around the world began to lock down, and the reality of a pandemic settled in, public insecurity led to panic shopping. Hand sanitizer, shelf-stable food, and toilet paper became rare commodities. We saw unprecedented scenes of empty shelves and overzealous hoarders. Grocery stores were a complete mess, with frontline grocery workers overworked and overwhelmed. Months later, grocery chains have had to adapt quickly: new guidelines to make both the merchant and customer experiences more efficient, limits on store capacity, and sanitizing everything. And while the supply of toilet paper has been restored, we're still seeing outbreaks occur in these highly trafficked and essential indoor spaces.

These rapid changes to the needs of both customers and merchants present an opportunity to innovate and help support these critical frontline workers. Our team saw store wayfinding as an underserved, unmet need, one exacerbated both by the needs of a new kind of customer (gig-economy shoppers) and by the new need of store clerks to physically distance from shoppers. It led us to ask the following prompt:

How might we improve indoor wayfinding for retail and grocery environments to better support consumers, store staff, and gig economy shoppers?

Our Approach

With a team composition of two engineers and one product designer, we set out a one-month project plan to research, ideate, define and develop a prototype that would allow us to validate and assess our concepts for feasibility, viability, usability, and desirability.

Research

Depending on our desired outcomes, and given the lower risk that internal research projects present, we can sometimes take a leaner approach to immersion research and get into the making more quickly. With a time-box of four weeks, we took this leaner approach while still ensuring we were grounded in our users' needs, understood what products exist for them today, and had enough technical knowledge to assess our concepts for feasibility.

User Research

Wayfinding at the grocery store is important to everyone involved, and we divided our customers into three distinct groups: consumers, store staff, and gig-economy shoppers.


While all our user segments experience the store differently, we can find a more viable product-market fit by addressing the pains that overlap between these different groups.

A method we like to include in many of our leaner lab projects to understand our users better is digital ethnography; for this project we relied almost entirely on digital ethnographic scans and literature review. Digital ethnography refers to carrying out ethnographic market research in an online space. Some benefits of this research methodology include:

  • Access to participants in their natural context and environment.
  • Online observation is less intrusive and less immediately apparent.
  • People’s behaviours are less likely to change than they would if a researcher were physically present.

By conducting a scan of forums, subreddits, and Facebook groups, we were able to record over 120 unique anecdotes and connect them with specific insights into our user segments. Here’s a look at just a few of the shopping perspectives that we found:


Benchmarking

It’s always important to understand what products and solutions are already available to your customers, how they solve your user segments’ pains, and how they amplify their gains. This lets you spot any unmet needs, and with them, potential opportunities.

We looked at traditional wayfinding in the form of aisle signage, indoor maps, and directories placed on aisle ends or mounted on the shopping carts themselves. The problem with these analog approaches is that they are rigid and costly to update. When the store is invariably re-merchandised (items moved around the store), the signage systems themselves need to be updated, a costly and time-consuming procedure. To mitigate this, these wayfinding systems tend to be overly broad and not specific enough to meet the highly granular nature of customers’ queries.

Newer approaches include mobile applications from companies like Walmart that include aisle numbers and stock levels. While an aisle number is better than nothing, it doesn’t properly orient shoppers or tell them which direction to head in a two-story, 100,000-square-foot Walmart. Where is aisle 72, anyway?

Lowe’s home hardware stores collaborated with Google to bring mobile AR (augmented reality) directions to shoppers. It suffered from many of the same issues that have hampered indoor AR navigation in the past, primarily the difficulty of scaling indoor mapping and blue-dot navigation, as well as the limited reach of the specialized hardware required. In addition, both of these approaches expect the user to handle and operate a smartphone while also pushing a shopping cart and grabbing items off the shelf.

Concept Generation & Evaluation

With a better grounding on the unique needs of our customers, we turned back to our initial opportunity framing.

How might we improve indoor wayfinding for retail and grocery environments to better support consumers, store staff, and gig economy shoppers?

By looking at the overlapping needs of our customer segments and by scanning what solutions are available today we were able to spot a potential gap of unmet user needs. We knew that all of our customer segments would benefit from a product that would serve the store with automated and detailed in-store navigation and that it needed to be intuitive and accessible to everyone.

The problem with traditional wayfinding is that it’s too broad when displayed as an aisle sign, and too rigid and inaccessible when directories or printed maps are used. The problem with the newer approaches is that the technology needed for detailed indoor navigation isn’t widespread, and traditional mobile phone applications are hard to use while shopping occupies the customer’s hands and eyes.

Our two winning concepts solve this by being hands-free while our users are shopping, and they can be implemented on a range of platforms and modalities in order to be accessible to a diverse range of shoppers. They also don’t presume the store has Bluetooth beacons or the spatial maps needed for advanced AR or blue-dot navigation. By leveraging conversational AI, we ensure that all of our customers can interact with our service, regardless of their technical literacy.

Prototype Development

When deciding on what type of prototypes we needed to validate our concept for usability and desirability we aren’t simply considering high or low fidelity. We actively consider all of the following metrics in crafting our scope and approach:

To better understand the usability and desirability of our concept, we wanted to test it in the store context with real customers. Doing so requires a functional prototype with dynamic flows and real data. Because we want to assess its utility and desirability, we’re aiming for relatively high experiential fidelity; but since this is just a prototype, the rendering fidelity can be lower than we would otherwise aim for in an MVP.

Mapping

To build a store navigator, you need to know where everything is. Since we need high experiential and functional fidelity in this prototype, our data must be accurate, so that when we test with users they get an experience they can properly assess for its desirability and usability. Mapping a grocery store is a bit daunting at first. We used a notepad and camera to capture the general location of aisles and categories, as well as an audio note-taking application paired with a wearable microphone for detailed on-the-go voice notes.

Begin with a base layer of the structure and the shelving/fridges. Next, mark the aisle numbers and traffic flows, followed by a high-level listing of item categories (taken from the overhead aisle signage). Lastly, cut everything into ‘zones’, which are used to give directions to general locations within the store.
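The mapping layers described above can be captured in a simple data structure. Here’s a minimal sketch; the zone IDs, aisle numbers, and categories shown are illustrative stand-ins, not the actual store data:

```python
from dataclasses import dataclass, field

@dataclass
class Zone:
    """A named region of the store used for giving directions."""
    zone_id: str                                          # e.g. "0a"
    aisle: int                                            # from the aisle-number layer
    categories: list[str] = field(default_factory=list)   # high-level aisle-sign categories

@dataclass
class StoreMap:
    """Layers built up during the mapping walk-through."""
    name: str
    zones: dict[str, Zone] = field(default_factory=dict)

    def add_zone(self, zone: Zone) -> None:
        self.zones[zone.zone_id] = zone

# Illustrative example
store = StoreMap(name="Example grocery store")
store.add_zone(Zone(zone_id="0a", aisle=1, categories=["produce"]))
```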

We chose a smaller No Frills grocery store in Parkdale (a neighbourhood in Toronto) for its size and proximity to our location. Here is what our initial map looked like.

Anthony’s NoFrills — Parkdale, Toronto

Database

As well as mapping the store layout and high-level categories, we needed to know, at a lower level, the items in each of our zones. This takes a lot of manual surveying and data entry, which needs to be stored in a database. We began with a Google Cloud SQL database but quickly realized that for this tool to be managed by store management, it would need a front end accessible to non-technical users, and that front end wasn’t in scope or needed for our testing purposes. With our short timeline, we opted to integrate with Google Sheets through the Sheets API in order to accommodate these users. This kind of ‘Sheets database’ can easily be accessed by anyone with a Google account, and for most people it’s a familiar and intuitive experience.

Here we can see the first row of our database. In the first column is Zone 0a, which our dialog manager treats as an entity. It’s identified when the user says any of the products in the second column, which the dialog manager treats as synonyms of Zone 0a. In the third column, we have a spot for location-agnostic directions. These are important because we want our solution to work independently of any hardware infrastructure; they’re also the natural way a person might describe a location, and a bit more helpful than an aisle number. In the last three columns, we have Place IDs and latitude and longitude coordinates, which are used by the routing algorithm in our second concept (the Store Navigator Kiosk).
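In code, the zone-and-synonym lookup described above might look like the following sketch. The rows here are illustrative stand-ins for what the Sheets API would return from the spreadsheet (the real sheet contents differ), and the function name is our own:

```python
def build_product_index(rows):
    """Map each product synonym to its zone, directions, and coordinates.

    Each row mirrors the sheet's columns: zone entity, comma-separated
    product synonyms, location-agnostic directions, Place ID, latitude,
    and longitude.
    """
    index = {}
    for zone, synonyms, directions, place_id, lat, lng in rows:
        for product in synonyms.split(","):
            index[product.strip().lower()] = {
                "zone": zone,
                "directions": directions,
                "place_id": place_id,
                "coords": (float(lat), float(lng)),
            }
    return index

# Illustrative rows, not the real store data
rows = [
    ("Zone 0a", "apples, bananas, oranges",
     "Just inside the entrance, along the left wall",
     "example-place-id", "43.6400", "-79.4300"),
]
index = build_product_index(rows)
print(index["bananas"]["directions"])
```

In the real prototype these rows would be fetched live from the spreadsheet, so edits made by store staff are reflected immediately.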

Dialogflow & Mapwize

Both our prototypes use the same conversation map: a very simple query-and-response flow with some minor conversation repair for when users ask for something that’s not in our database. The directional responses live in our database and can be changed very easily by anyone with access permissions in the Google Sheets document; changes are instantly reflected in our conversational agent. We used Dialogflow as our dialog manager, along with the Google Actions integration. Dialogflow’s many integrations are worth highlighting as a benefit of the platform: we can quite easily deploy our work to new platforms, for instance as an Alexa Skill. There is even a telephony integration that would let us assist over a phone call or SMS exchange, making this product even more accessible.
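The query-and-response flow with conversation repair can be sketched as a small fulfillment handler. The function name and message wording here are illustrative; in practice, Dialogflow resolves the product entity before our fulfillment logic runs:

```python
def handle_query(product: str, index: dict) -> str:
    """Return directions for a product, or a repair prompt if it's unknown."""
    entry = index.get(product.strip().lower())
    if entry is None:
        # Conversation repair: the requested item isn't in the database
        return (f"Sorry, I couldn't find {product}. "
                "Could you try another item, or a more general name?")
    return f"{product.capitalize()} is in {entry['zone']}: {entry['directions']}"

# Illustrative index entry
index = {"milk": {"zone": "Zone 3b",
                  "directions": "Back right corner, in the dairy fridges"}}
print(handle_query("milk", index))
print(handle_query("caviar", index))
```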

We also wanted our second concept, the Store Kiosk, to include map visualization and pathing. This was accomplished using Google’s Interactive Canvas for Actions, as well as the API for Mapwize, an indoor mapping platform. Canvas allows our Google Action (voice application) to control events within a web app that is displayed during the Action interaction on a smart display device like the Google Nest Hub. It lets us provide a visualization that complements our Action and is not constrained to the standard info cards, media responses, and table templates found within the Actions SDK. We bring in the visualization we receive from Mapwize and pair it with a transcript view of our conversational interactions, which we believe aids the usability and accessibility of the Store Kiosk.

Demo

Concept 1 | Store Navigator

Our first concept is a Google Action that offers a simple query-and-response interaction. It is a voice-first product that can be invoked and used completely hands-free using hearables (wearables for your ears) like Google’s Pixel Buds, or with a standard pair of wired headphones with an in-line microphone. You can also use it on a mobile device. You can see the conversational interaction here:

Concept 2 | Store Kiosk

For our Store Kiosk concept, we’ve developed a complementary visual component that can be displayed on a smart display within an in-store self-serve kiosk. This includes the map layout as well as a pathing/routing visualization that takes the user from the kiosk location to the location of the product they are inquiring about.

Store Kiosk running on a Lenovo Smart Display

Next Steps

Demand Validation & Usability Testing

Now that we have two functional and experientially complete prototypes, we would like to test them with actual users. We’ve deployed alpha versions of our prototypes, which means we can include external users in our testing. We’re currently recruiting local shoppers in Parkdale who would like to try using Store Navigator on their smartphones, and will follow up with them to solicit feedback about the desirability and usability of our product concept.

Additional Exploration

We’ve learned in this process that mapping indoor spaces and managing merchandising databases are difficult challenges for scaling this concept. There are currently no widespread standard formats or tools for doing this. This has prompted us to continue our exploration into indoor wayfinding and target this newly identified opportunity more specifically. We’d like to ask:

How might we empower retailers and shoppers alike to intuitively and easily create and update maps and databases for indoor retail environments?
