Published in Marketade

Quality vs. Quantity: Moderated Usability Testing Unlocks Problems in a Bank Account Onboarding Flow

Photo by Marjan Grabowski on Unsplash

“How can you possibly hope to uncover our website’s problems by interviewing under 10 people?” It’s a question we often hear — in some variation — from prospective clients.

Our team can point to many cases where fewer than 10 moderated UX testing sessions provided rich insights into a website or application’s barriers, both major and minor. When we’re having this discussion with clients, we usually cite the following reasons why small numbers can make a big impact in moderated testing:

  • Careful recruiting with additional phone screening ensures that the small number of users aligns with the target audience. When we’re more selective, we can get more relevant feedback.
  • Our experienced researchers can ask follow-up questions and probe further when we sense that a test participant is hesitating or encountering a problem. That way, we get to the root of a problem instead of expecting users to report what frustrates or confuses them and why.
  • Live moderation enables us to create a rapport with the interviewee and put them at ease so we get more relaxed, genuine input. Think about it: would you give better answers while typing into a form or casually talking to a good listener?

But we know that, in many cases, seeing (and hearing) is believing. If you haven’t observed the power of moderated testing firsthand, you may find it difficult to imagine how a small sample can yield such valuable results.

One of our recent clients felt strongly about obtaining data from large numbers of users through quantitative methods. At Marketade, when a prospective client challenges us, we’re grateful: their skepticism gives us a chance to check our own assumptions. After all, we’re in the testing and experimentation business.

So we struck a compromise: we’d conduct fewer than 10 moderated, qualitative testing sessions and run 100 unmoderated, quantitative sessions. By doing both, we could evaluate the pros and cons of each method. (Spoiler alert: in the end, quality outweighed quantity when it came to useful findings.)

Challenge: Identify remaining barriers in an account opening flow

Our client, an online-only bank with more than $40 billion in assets, redesigned its savings account onboarding flow. User drop-offs at several points in the process prompted this overhaul. The new concept shortened and streamlined the account opening flow.

Before releasing the revised design into the wild, the bank’s team wanted to make sure that the flow worked as smoothly as they hoped. Their design firm created high-fidelity mock-ups to test before an additional round of improvements. Three questions drove the research:

  • What problems remained to be solved?
  • Did the new order of screens and tasks make sense to users?
  • How could the design and content be further improved?

Action: Conduct moderated and unmoderated user experience testing on the flow

We converted the bank’s savings account application mock-ups into a simple Axure prototype. This test version allowed users to go through the flow and enter information as needed, although some functionality was limited.

For moderated testing, we:

  • Recruited 9 participants who closely fit the client’s target profiles: in the desired age ranges, most with over $25,000 in savings, and comfortable with the idea of opening a savings account with an online-only bank.
  • Phone-screened each participant to confirm their answers, gauge their frankness, and learn more about why they planned to start a new savings account.
  • Spoke directly with participants during 30-minute Zoom sessions as they shared their screens and webcams.
  • Watched participants complete the prototype flow as they shared their reactions in real time. We asked for clarification around any friction points we noticed.

For unmoderated testing, we:

  • Recruited 100 participants in the target age ranges who indicated in a screener survey that they were open to online-only banking.
  • Sent participants to independently complete the prototype flow in Loop11, a quantitative testing platform.
  • Included form-style questions once users had gone through the flow. In writing, participants rated the ease or difficulty of the flow and shared their biggest unanswered question and frustration point from the process.

Result: Deliver moderated testing insights into form barriers — backed up by unmoderated testing

When we presented our findings from the two rounds of testing, both proved valuable to the client. However, the unmoderated results would not have taught us much without the moderated results. By contrast, the moderated results could have stood on their own, despite the welcome validation of the unmoderated data.

For example, a pain point reported by both the 100 unmoderated users and the 9 moderated users was funding a new account.

What exactly was the nature of the problem? Unmoderated users expressed uncertainty around the funding terms and timeline, but their answers were brief. Moderated testing revealed a fuller picture of the problem: lack of clarity about how next steps would work, confusion around specific terms, discomfort with the security of instant verification, and annoyance with the idea of additional tasks. Based on those findings, we recommended:

  • Add explanatory copy on mobile to clarify the choice between funding methods.
  • Revise and clarify copy related to funding verification on several specific screens.
  • Emphasize time expectations. Consider making “How long it takes” the first explanatory bullet for each funding method.
  • Rewrite headings for the final screen so that they are more direct instructions and less technical.
  • Consider adding a visual, like a graphic or a short video demo, to illustrate how the verification methods work.

Moderated interviews made us and our client realize how strongly consumers care about fees. Almost all of the participants tried to click on the Fee Schedule link or asked about fees. While some unmoderated users documented this issue, the depth of interest in this information would not have been clear without the moderated sessions.

The video clips from moderated sessions also sparked more discussion and engagement in the final presentation. The data from the unmoderated sessions helped to confirm our conclusions from the moderated sessions, but did not offer the same kind of compelling window into consumers’ emotions and preferences.

Quite simply, watching and listening to your consumer base is an eye-opening experience. Moderated participants can make a point on a personal level that written answers and charts cannot. It’s easy to underestimate a barrier until you’ve seen and heard a consumer struggle with it in real time.

When expertly recruited and facilitated, a small number of moderated sessions can not only identify problems, but also suggest the reasons behind those problems. Unmoderated testing may tell you what is wrong, but moderated testing can give you the why behind user actions — the key to solving complex problems.
