
User Research with Prototypes: Asking the Right Questions

Dan Brown
EightShapes
Published in Discovery Techniques · May 15, 2017


Showing users a half-baked prototype is part usability testing, part user research. You get feedback on actual designs, and you get to learn about participants' mental models through deeper questioning.

In conducting these interviews, I have a few simple tricks to get people to talk more about who they are, what they need, and what they’re thinking. Before I get to the tricks, though, I want to describe a simple framework I use for constructing these scripts.

A basic conversation pattern

Like a design pattern, this framework is a starting point. It gives me a basic hook, which I then embellish to suit the need.

1. Establish frame of mind: Participants need to look at the prototype from the right perspective. I'll ask them a question simply to set up the next one, to make sure they're approaching it with the right mindset. By asking these questions, even ones I know the answer to, I've primed the participant for the next series of tasks.

Example: In designing an application to support portfolio reviews, I might ask a reviewer participant, “When you’re looking at someone’s portfolio, what kinds of things are you looking for?” Encouraging them to reflect on previous experiences, I’ve primed them to apply that perspective to the prototype.

2. Ask about expectations: With the participant primed, I ask them to describe what they expect to see in the product itself. This applies to a variety of situations. At the smallest level, it might be that they’ve pointed to a link or button they want to click. More broadly, I can ask about their expectations for the whole application. It’s the priming in the first step that lets me do this.

Example: In the app for portfolio reviews, the first screen is a list of people they need to review. I ask, “What do you think you’ll see when you click on a person’s name?”

3. Reveal and ask about alignment: One cardinal rule of user research is to avoid yes-no questions like “Does this meet your expectations?” My favorite construction here is “How is this different from what you expected?” I might even remind them of what they said earlier to reinforce that perspective. By asking how it's different, I've encouraged them to contrast what they see with what they imagined.

Example: Once the portfolio reviewers clicked on a person's name, they did not see a list of pieces in the portfolio. Instead, they saw a list of the standard categories that portfolio items were assigned to. In this case, the reviewers were all familiar with the categories; though the screen differed from their expectations, none of them was surprised by the arrangement.

Redirecting on the fly

Creating these tests demands crafting a script that surfaces both design issues and user insights. The real magic happens during the interview itself. As participants talk, I'm actively listening, using the things they say as cues to dig deeper. In these kinds of tests, a few types of responses turn up time and again. Here's how I turn those responses into more insights.

“I don’t understand why…”

Participants will express confusion about something by questioning the reason behind it. It’s pretty easy to turn this around to:

  • Why do you think it looks like this?
  • Why do you think it’s organized like this?
  • Why do you think this comes first?

Be careful not to explain the reason for something and then ask “Does that make sense?” or “Do you agree?” Those questions shed no light on the participant's mental model of your system.

“I don’t know how this got here.”

Participants will question the source of the information they're looking at. Part of our mental model of an information space is that its contents have a known source, even if the exact identity of that source is unclear. I turn this around to:

  • How do you think this got here?
  • Where do you think it came from?

Be careful not to tell them the source or explain the inner workings of your system.

“I think this is here because…”

If participants offer an explanation, that gives you a great opening for probing further. My favorite response to that is:

  • What makes you say that?

Be careful not to tell them they’ve gotten something wrong or that their impression is incorrect.

“I like that I can…”

If participants highlight something that works or that they like, that's also an invitation to probe further. When something clicks for them, there's a reason, and it's your job to uncover that reason. So I ask them:

  • What’s a scenario where this might be important?
  • How would you use this in your day-to-day job?
  • How do you handle this today?

Be careful not to gloss over this as a compliment. It’s nice that they liked the design, but you need to find out what about it resonated with them.

“Does this product do or have…?”

Sometimes participants look to you to clarify the functionality or content of the product. If you’ve been involved with the design, it’s almost as if they’ve asked you about your child. You almost can’t wait to tell them all about it. To resist this urge, I turn the question around:

  • What do you think?
  • How would that fit in with what you see?

Be careful not to just answer their question right away. Even asking “Do you think it should?” is counter-productive, as it turns an insight into a yes-no question.

Keeping me honest

In the five techniques above, I caution you against directly alleviating participants' confusion or misapprehension. You don't answer their questions because there's a slippery slope: start answering their questions and you might start defending the product design. Defend the product design and you've primed them to be confrontational rather than cooperative.

At the same time, avoiding their questions entirely can make you seem less trustworthy. Once I probe a bit, I respond to their question. And then, as if I’d just shown them something on the screen, I’ll ask, “How is that different from what you’d expect?” They may reiterate an earlier answer. New information, however, can trigger additional thoughts.

Sometimes I want to prompt participants to elaborate. I'll recap an earlier answer and ask them additional questions. When I do that, though, I'm careful to give them a chance to correct me; I want to make sure I've formed an accurate impression of their mental model. I'll say, “Here's what I thought you said, but keep me honest and correct me if I'm wrong.” That gives them permission to correct me, and also room to clarify what they meant.

This is a different kind of testing from straight usability testing. We're not focused solely on whether participants complete tasks, or how long it takes them to do so. Instead, we're using the prototype to uncover new insights about the target audience, in addition to getting feedback on our design work. Working with half-baked prototypes can be daunting. The techniques here help you:

  1. Prepare a good script
  2. Focus on uncovering your users’ mental model
  3. Stay humble about your own impressions

Working on a new digital product? EightShapes can advise or drive projects to get you to a product definition and vision. We can help you create a prototype and test it to learn about your users. Want to talk about your work? Get in touch!
