What’s the right level of design fidelity for user testing?

Cormac O'Dwyer
Published in intercom-rad
4 min read · Jul 21, 2022


At Intercom, one of our product development principles is Think big, start small, learn fast. One of the ways we learn fast is by doing evaluative research to build confidence in the product design direction we’re pursuing. We often aim to get product concepts in front of users very early in the product development process, long before any code has been written.

But what’s the right level of fidelity to present to your research participants?

It can be tempting to use the latest, most polished design files available, but higher fidelity is not always better. Participants will pay attention to the highest level of fidelity that you provide — if you’re at an early stage in the process and interested in whether a concept has value, hearing participants’ comments about microcopy on a CTA is likely to add noise to the signal you’re looking for.

Instead, we’ve found it helpful to start with the problem — what is it we’re trying to learn about? — and match the fidelity of what we show research participants to that. Getting clear on the problem up-front also helps create alignment with your cross-functional partners in design, product and engineering — all the more important because creating a research stimulus is normally a particularly collaborative effort, and one where ownership can be unclear.

(As an aside, we’ve also found it helpful to avoid using the term “design” when talking internally about what to show participants. Terms like “research stimulus” or “provocation artefact”, while they sound wonky, help communicate that the thing you’re going to show a participant is a tool to elicit insights, not to showcase a well-rounded design — there’s no need for your design partner to spend time taking it to a high level of fidelity; it’s OK to keep it rough.)

So which of the following are you doing?

Testing concepts — does this idea have value?

At this stage, your problem is: Does this idea have value? And specifically, do customers see the idea as helping them solve problems that you know they have? You’re not interested in how they’d interact with it. You’re interested in whether it would be valuable to them.

During the test itself, you’ll want to confirm that the participant actually experiences the problem you’re solving, show them the concept, and evaluate whether they expect it to solve their problem in a valuable way.

Your test stimulus doesn’t need to look like an interface at this point. In fact, it’s probably better if it doesn’t — if you do choose to present concepts as an interface, be aware that your participants may jump ahead to thinking about how they’d interact with it, how it would relate to the rest of your product, and what they think about the visual design. This kind of feedback is easier to give, and participants will gravitate towards it.

Instead, your research stimulus just needs to communicate the idea you’re testing, and little else. Quick text descriptions, storyboards, landing page mockups, diagrams in Google Slides, or whiteboard-style doodles are all legitimate options. Keep it simple and focussed by blurring unnecessary copy and using obvious placeholder graphics.

Testing system design — can users form a ‘good enough’ understanding of this system?

Sometimes, a project may require you to build out a whole system, rather than interfaces or flows. At this stage, your problem is whether, based on your work so far, your users can form functional mental models that will enable them to use the product without getting blocked or confused.

The best evidence for the quality of these mental models is that users can make accurate predictions about how they’d do representative tasks.

While you could simply show abstract diagrams of the system and ask them how they’d expect to use it, this doesn’t have much ecological validity; actually showing interfaces makes the research feel more realistic. Again though, we’ve found that asking a participant to use a prototype makes it almost impossible not to have their attention deflected to interaction design, and you’re probably still not solving for that.

Consider giving them realistic problems and asking them to talk you through how they’d expect to tackle them using your mocked-up interfaces, while you, the researcher, share your screen and look after the specific interactions that move them through the product. This will save you either building out fully interactive prototypes before you have confidence in the design direction, or risking frustrating participants with interactions you know are still janky.

Testing usability — can people use this product?

At this stage you’ll want to know broadly whether a user can interact with your product with a realistic level of support (e.g., no more onboarding than you’re likely to provide in production) and carry out tasks that represent the problem you’re solving.

There are many kinds of usability testing, but in pretty much all of them, interaction design becomes more important. You can test usability with prototypes at varying levels of fidelity, as long as you’re clear on the tasks you want your participants to carry out and the prototype supports the interactions you expect will enable them.

In summary

The lines between these phases are blurry, and it can be tricky to land on the right protocol for your situation. Prepare to tweak both the research artefact and the questions you ask from session to session. Also, there’s no harm in asking your participants how they found the research session itself, rather than the research stimuli. Confusion or frustration is a sign that you have some more work to do. And if you get stuck, remember to start with the problem: what exactly are you trying to learn from your study?

At Intercom, the Research, Analytics & Data Science (a.k.a. RAD) function exists to help drive effective, evidence-based decision making using Research and Data Science. If you want to build valuable products and help shape the future of a team like RAD at a fast-growing company that’s on a mission to make internet business personal — we’re hiring and we’d love to hear from you.
