Designing a Scanning Experience for Nordstrom Fulfillers (Part 1)

Sabrina Weschler
8 min read · Dec 3, 2019


I’m a user experience designer on the team at Nordstrom that designed an iOS mobile fulfillment app that’s used in all 100+ Nordstrom stores by 2,500+ fulfillers in the U.S. and Puerto Rico. This app is called OneFill, and it’s used by fulfillers who pick, pack, and consolidate items that are ordered from Nordstrom.com.

In this blog post, I’m sharing my design process and approach for an under-documented design area: Scanning features that integrate with the built-in iPod camera. In my next blog post, I’ll share my design approach for scanning’s complementary feature: Manual entry.

A user flow showing the happy path scanning experience

Design challenge overview

Scanning with specialized hardware is a very different experience from scanning with a smartphone or iPod camera. While we investigated several scanning options, our focus landed on the native device camera.

Prior to OneFill, Nordstrom fulfillers were used to scanning with external hardware. An advantage of that method is that users can point the laser directly at a barcode, so they don’t have to focus on the device screen. With a camera scanner, the user must look at the screen and focus the camera on a barcode to scan it. If the barcode is located above the user’s head, using the camera becomes even more difficult. Going into designing for the native camera scanner, we knew we’d have to work to mitigate these challenges.

Key takeaways and tips for scanning with a camera

Camera view location matters

Have you ever tried taking a picture with your smartphone without looking at the screen? If so, I’m sure you’ve learned how hard it is to take a good picture that way, and how much you rely on the screen to point and focus the camera on the subject you want.

Using the iPod camera is a new experience for fulfillers, who previously never had to be mindful of where the camera is located on the device and how its view translates to the screen.

Designing the camera focus experience for the OneFill app was tricky. I learned that you want the on-screen camera view as close to the lens as possible, without compromising the information architecture of the app, such as the location of the header.

Design solution

On an iPod and most smartphones, the camera lens is located at the top of the device, so design the camera scanning area close to the top of the screen.

Camera scanning area designed to be in close proximity to the camera lens

This way, you maintain a more natural user experience — when the user points the lens at a barcode, the image of that barcode is reflected on the screen close to where it actually is in space.
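
If you’re building this with AVFoundation, the placement comes down to where you put the camera preview layer. Here’s a minimal sketch (the header height and the 50% split are placeholder values, and the capture session setup is omitted) that keeps the live camera view near the top of the screen, close to the physical lens:

```swift
import AVFoundation
import UIKit

final class ScanViewController: UIViewController {
    // Capture session setup (inputs/outputs) is assumed to happen elsewhere.
    private let session = AVCaptureSession()
    private let previewLayer = AVCaptureVideoPreviewLayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        previewLayer.session = session
        previewLayer.videoGravity = .resizeAspectFill

        // Keep the camera view in the upper portion of the screen, just below
        // the header, so it sits as close to the physical lens as possible.
        let headerHeight: CGFloat = 64 // placeholder header height
        previewLayer.frame = CGRect(x: 0,
                                    y: headerHeight,
                                    width: view.bounds.width,
                                    height: view.bounds.height * 0.5)
        view.layer.addSublayer(previewLayer)
    }
}
```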

Choose the correct camera scanning area size

I shadowed fulfillers at Nordstrom stores during my discovery phase, and I noticed that they looked for many different items at the same time. So when it came time to scan the item, it was very important for fulfillers to have easy access to general item information so that they could make sure that they found the right item.

Design solution

Split the screen real estate between the camera scanning area and supporting information, such as the image of the item.

The scanning experience split between the camera scanning area and supporting information

The screen area that you allocate for the scanning task does not have to fill the entire screen. Use part of that area to show relevant information and form fields. We determined that the scanning area should occupy at least half the screen on an iPod screen of 320 by 568 points. (This ratio, though, may vary based on device screen size.)

You might consider adding a reference frame on the scanning area to help users visually frame the item tag or barcode. Technically, though, the item tag or barcode can be captured in any part of the camera scanning area.
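
One implementation detail worth knowing if you use AVFoundation’s built-in barcode detection: the metadata output scans the full camera frame by default, so you may want to restrict detection to the on-screen scanning area. A rough sketch, assuming the metadata output is already attached to a running capture session and the symbology list is illustrative:

```swift
import AVFoundation
import UIKit

// Limit barcode detection to the visible scanning area rather than the full
// camera frame. Assumes `metadataOutput` is already attached to a running
// session that includes a camera input, and `scanAreaFrame` is expressed in
// the preview layer's coordinate space.
func configureScanRegion(metadataOutput: AVCaptureMetadataOutput,
                         previewLayer: AVCaptureVideoPreviewLayer,
                         scanAreaFrame: CGRect) {
    // Symbologies are illustrative; use whatever is printed on your item tags.
    metadataOutput.metadataObjectTypes = [.ean13, .code128, .upce]

    // rectOfInterest uses normalized, rotated coordinates, so convert from the
    // on-screen scanning rectangle to keep detection aligned with what the
    // user sees in the scanning area.
    metadataOutput.rectOfInterest =
        previewLayer.metadataOutputRectConverted(fromLayerRect: scanAreaFrame)
}
```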

Integrate touch-to-focus

Design solution

We learned that incorporating a touch-to-focus feature, if possible, offers a more familiar user experience.

Because users were already used to tapping to focus, the action was part of their conceptual model of a camera, and it clearly gave them a sense of control when a barcode took longer to scan.
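
On iOS, a sketch of touch-to-focus might look like the following, assuming the preview layer and the active capture device are already set up and the tap point is in the preview layer’s coordinate space:

```swift
import AVFoundation
import UIKit

// Focus (and expose) the camera at the point the user tapped on the preview.
func focus(at tapPoint: CGPoint,
           previewLayer: AVCaptureVideoPreviewLayer,
           device: AVCaptureDevice) {
    // Convert the on-screen tap into the camera's coordinate space.
    let devicePoint = previewLayer.captureDevicePointConverted(fromLayerPoint: tapPoint)
    do {
        try device.lockForConfiguration()
        if device.isFocusPointOfInterestSupported, device.isFocusModeSupported(.autoFocus) {
            device.focusPointOfInterest = devicePoint
            device.focusMode = .autoFocus
        }
        if device.isExposurePointOfInterestSupported, device.isExposureModeSupported(.autoExpose) {
            device.exposurePointOfInterest = devicePoint
            device.exposureMode = .autoExpose
        }
        device.unlockForConfiguration()
    } catch {
        // Focus is a best-effort enhancement; ignore configuration failures.
    }
}
```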

Use instructional message overlays

Scanning becomes a bit complicated when users need to scan more than one barcode or when there’s a specified order for scanning barcodes. If the task carries this kind of complexity, consider building instructions into the design.

Design solution

Use a semi-transparent overlay for instructional messages over the camera scanning area. This approach keeps more screen real estate for the camera scanning area, and even if the barcode appears under the instructional message, the camera can still pick it up.

Semi-transparent instructional message overlay
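
As a rough sketch of the idea (sizes and colors are placeholders), the overlay is just a translucent view layered over the camera preview; it only affects what’s drawn on screen, not what the camera captures:

```swift
import UIKit

// A translucent instruction banner layered over the camera preview. The
// barcode can still be detected underneath it because the overlay only
// changes what is drawn, not what the camera sees.
func makeInstructionOverlay(text: String, width: CGFloat) -> UIView {
    let overlay = UIView(frame: CGRect(x: 0, y: 0, width: width, height: 56))
    overlay.backgroundColor = UIColor.black.withAlphaComponent(0.6)
    overlay.isUserInteractionEnabled = false // let taps pass through for touch-to-focus

    let label = UILabel(frame: overlay.bounds.insetBy(dx: 16, dy: 8))
    label.text = text
    label.textColor = .white
    label.numberOfLines = 2
    label.adjustsFontSizeToFitWidth = true
    overlay.addSubview(label)
    return overlay
}
```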

Use loading indicators

Have you ever tapped a button multiple times in an app because you didn’t receive immediate feedback from it? Or worse, exited the app entirely and restarted it? With apps that continuously load data, what’s often happening is that the tap has already sent a backend call and the app is simply waiting for a response.

Design solution

If a scan needs to send a backend call, then make sure to include a loading indicator to let the user know that the app has scanned the barcode and is verifying that the information is correct.

Loading indicator displayed on an instructional message area

A loading indicator is a common element that most users will recognize as a “wait” sign from their use of consumer-facing apps or websites.
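
Here’s a minimal sketch of that pattern. The FulfillmentAPI client below is a made-up stub standing in for whatever backend call verifies the scan:

```swift
import UIKit

// Hypothetical backend client (a stub for illustration only).
enum FulfillmentAPI {
    static func verify(barcode: String, completion: @escaping (Bool) -> Void) {
        // Simulate a network round trip.
        DispatchQueue.global().asyncAfter(deadline: .now() + 1) { completion(true) }
    }
}

// Show a spinner the moment the barcode is read, and remove it once the
// backend confirms (or rejects) the scanned item.
func handleScan(_ barcode: String, in containerView: UIView) {
    let spinner = UIActivityIndicatorView(style: .medium)
    spinner.center = CGPoint(x: containerView.bounds.midX, y: containerView.bounds.midY)
    containerView.addSubview(spinner)
    spinner.startAnimating()

    FulfillmentAPI.verify(barcode: barcode) { isValid in
        DispatchQueue.main.async {
            spinner.stopAnimating()
            spinner.removeFromSuperview()
            // Continue to the success or error state based on the response.
            print(isValid ? "Scan verified" : "Scan rejected")
        }
    }
}
```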

Use clear success indicators

After the initial scan designs and flows were developed, we conducted a usability study with live code. I immediately noticed that participants struggled to understand whether their scan was successful. I had only included visual feedback on a successful scan at first, and the users barely noticed a difference. They wanted a more prominent success indicator.

Design solution

One distinctive way to indicate success is to add haptic feedback on a successful scan. This design solution also makes the feedback more accessible to users with disabilities.

Another way to indicate success is to include a sound, but this solution could be disruptive, especially if the app is being used around other employees or customers.
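
On iOS, both options take only a few lines with standard APIs. A sketch (note that the haptic call is a silent no-op on hardware without a Taptic Engine, and the system sound ID below is just a placeholder):

```swift
import AudioToolbox
import UIKit

// Signal a successful scan. The haptic is a no-op on devices without a
// Taptic Engine; the sound is optional and may be disruptive on the floor.
func signalScanSuccess(playSound: Bool = false) {
    let generator = UINotificationFeedbackGenerator()
    generator.notificationOccurred(.success)

    if playSound {
        // Placeholder system sound ID; a bundled sound file is usually a
        // better choice for a predictable, branded tone.
        AudioServicesPlaySystemSound(SystemSoundID(1057))
    }
}
```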

Enable easy access to the camera

Have you ever used an app that required your smartphone camera, but you hadn’t allowed access to the camera yet? You’d have to leave the app, go to Settings, find the app, and then enable the camera.

If a user hasn’t granted the app permission to access the camera, they should see general guidance or step-by-step instructions for enabling it. For example: “To scan barcodes, you’ll need to enable camera access.”

Design solution

To set up the most efficient user experience, deep link from your app to its screen in Settings. Then all the user has to do is tap the deep link button, which takes them directly to the setting where they can enable the camera.

A deep link experience that takes the user directly to the device Settings for enabling OneFill camera access
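
On iOS, the deep link is the app’s own Settings URL. A sketch of the permission check and deep link, using the standard AVFoundation and UIKit calls:

```swift
import AVFoundation
import UIKit

// Check camera permission before showing the scanner. If access was denied,
// deep link straight to this app's page in Settings so the user can enable it.
func startScanningOrOpenSettings() {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        break // Camera is ready; start the scanning flow.
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video) { _ in
            // Re-check and start scanning once the user responds to the prompt.
        }
    case .denied, .restricted:
        // "To scan barcodes, you'll need to enable camera access."
        if let settingsURL = URL(string: UIApplication.openSettingsURLString) {
            UIApplication.shared.open(settingsURL)
        }
    @unknown default:
        break
    }
}
```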

Set form fields in an initially deactivated state

I designed the scanning experience before manual entry, where the user types in information by hand. One oversight of designing in this order was that each form field would eventually need to be interactive for the manual entry feature. With only scanning implemented, the form field was active before the scan. An active form field sends the wrong visual cue to the user: it suggests that nothing will change when they tap the field, when in reality tapping it would open the manual entry experience.

Design solution

Form fields should start deactivated when the scanning screen is open. After the user successfully scans a barcode, the form field should activate and auto-populate with the scanned code.

Deactivated form field (left) and active form field (right)
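
A small sketch of how those two states might be handled for a UIKit text field (the names and the dimmed alpha value are illustrative):

```swift
import UIKit

// The barcode field starts deactivated and only becomes interactive once a
// scan succeeds, at which point it is auto-populated with the scanned code.
final class ScanFormField: UITextField {
    func resetForScanning() {
        text = nil
        isEnabled = false   // deactivated state before the scan
        alpha = 0.5         // visual cue that the field isn't active yet
    }

    func populate(withScannedCode code: String) {
        text = code
        isEnabled = true    // tapping now opens editing / manual entry
        alpha = 1.0
    }
}
```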

Use the flashlight to help the camera scan

The smartphone camera is much more sensitive to lighting conditions than a hardware scanner. If the user is trying to scan a barcode in a dark area, there’s a chance that the camera will either have a hard time focusing and reading the barcode, or not be able to read it at all.

Design solution

Incorporate flashlight functionality that the user can quickly and easily turn on and off to improve the visibility of the barcode.

Flashlight button in an “on” state
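
On iOS, the “flashlight” is the capture device’s torch. A minimal sketch of the toggle, assuming the back camera is the active device:

```swift
import AVFoundation

// Turn the torch on or off to help the camera read barcodes in dark areas.
func setTorch(on: Bool) {
    guard let device = AVCaptureDevice.default(for: .video),
          device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = on ? .on : .off
        device.unlockForConfiguration()
    } catch {
        // The torch is an assist, not a requirement; fail silently.
    }
}
```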

Separate system and user error messages

I was able to reuse my instructional message design patterns for both types of error messages to enable a more consistent user experience.

Design solution

System error messages are displayed over an overlay, and user (scanning) error messages are displayed in the context of the error.

System error (left) and user error (right)

Because user errors require an action by the user, I added a colored background (we chose red from our pattern library) and changed the font color to an appropriately contrasting color. Because color alone can be visually challenging for some users, I also highlighted the form fields where the errors occur and added an icon.
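
As a rough sketch of that in-context (user) error treatment, with placeholder colors standing in for the pattern-library red:

```swift
import UIKit

// Highlight the field where the user error occurred and pair the color with
// a message and an icon so the state doesn't rely on color alone.
func showUserError(on field: UITextField, message: String, in messageLabel: UILabel) {
    field.layer.borderWidth = 2
    field.layer.borderColor = UIColor.systemRed.cgColor // placeholder for the pattern-library red

    messageLabel.text = message
    messageLabel.textColor = .white
    messageLabel.backgroundColor = .systemRed
    // An error icon (e.g., an exclamation mark image) would sit alongside the label.
}
```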

Thanks for reading! Be on the lookout for Part 2 (manual entry feature), which is coming soon. I encourage you to share your thoughts and ideas on this scanning feature with me.

Sabrina Weschler is a user experience designer at Nordstrom. She designs apps that support Nordstrom employees and customers. She started at Nordstrom as a software engineer, and then transitioned into UX design. She is passionate about designing functional and user-friendly apps that empower users and increase their daily productivity and happiness. You can tweet Sabrina at @weschlersabrina.

Shout out to Karen Scipi (@karenscipi), our lead UX Writer/Content Strategist, for helping shape the words on this blog!

Views and opinions expressed in this blog post are those of the author and do not necessarily reflect the views of Nordstrom.
