Interviewing customers: idea generation or validation?

This morning I saw a series of tweets from Matthew Gunson:

I am trying to teach co-workers about the superior validity of the kind of data that comes out of interviews — unprompted and volunteered — compared to data extracted through direct questions. Is there any academic writing on this?

Trying to think of an analogy or comparison that conveys the marginal value of three separate pieces of volunteered unprompted data to dozens of discrete responses to a direct question. Almost feels like the latter is completely insignificant by comparison. — MatthewGunson

This is the sort of discussion I love to dig into, but 280 characters isn’t enough. So here are my thoughts, Matthew. They went in a few different directions..

Before you can weigh the relative value of two different approaches, you need to start with what you’re looking to achieve.

I’m quick to think of these two types of customer data as “qualitative vs. quantitative,” although it may not be that straightforward. A better descriptor is probably divergent vs. convergent inquiry.

If you follow a line of inquiry and pull out emergent possibilities, you’re doing divergent work.

Jobs to be Done — Switch interviews (customer case research) do this by focusing on the timeline related to product purchase or trial. “Tell us the story of the scenario around your purchase, and details about factors affecting your decision will emerge”.

Jobs to be Done — Outcome-Driven Innovation Job Mapping interviews do so as well, although in a slightly more structured way. Interviews are focused around the steps to getting the Job Done, and where users struggle in what they’re trying to accomplish. The format of a Job Statement guides researchers and designers so they can ensure they’re providing customers with what they need to get their core Job done.

Faster Horses, what?

We’ve all heard the (apparently falsely attributed) Henry Ford quote that if he’d asked people what they wanted, they’d have said faster horses. This is because he imagined the conversation would be focused on the what, not the why.

By using the Job to be Done as the focus of analysis, we can leapfrog obvious solutions. When I was a Product Strategist at ReadyTalk, the sales team hated it when they’d come to us with a feature request and I’d ask, “Why does the customer want that? How does it help them get their job done better?”

It’s not the customers’ job to tell us what to build

When you ask a customer “what should we fix?”, you’re asking them to focus on you. That’s backwards. You should be focusing on them and what they’re trying to accomplish. Instead of asking “do you like this?” or “how could this be better?”, spend the time asking them about what they’re trying to achieve and where they’re struggling. Don’t probe for product suggestions.

Back to What You’re Trying to Achieve

Interviewing Customers for Inspiration, not Direction

Sounds like you’re working on an existing product, and trying to improve it. What does “improve it” mean?

Mike Boysen recently released a great post about the Three Levels of Progress a supplier can focus on in improving their offering:

  1. Help customers get the Job Done better
  2. Help customers get more of the Job Done on a single platform
  3. Help customers get more Jobs Done on a single platform

Sometimes as designers, product managers and developers, we’re a bit down in the weeds. We are focused on user metrics like “decrease churn” or “increase daily active users”. These map well to the levels of progress Mike outlines, even as they’re at a slightly higher (product portfolio strategy) level.

Let’s say we have a goal to decrease churn or increase time spent on site. We could conduct user interviews to understand the Job they’re trying to get Done, and how we fit into their workflow.

Rather than asking in a purely open-ended fashion “what should we improve?”, you could walk through a Job Map to identify which part of getting the Job Done they’re struggling with most along each step of the process.


This way, you’re getting a more comprehensive view on what they’re trying to accomplish.

I’ve long been a fan of the “if you had a magic wand, what’s one thing you would fix?” question, but the feedback is of limited application. Basically, it surfaces low-hanging fruit that may or may not actually make a difference in the user’s behavior. One option is to ask this question at the very end of an interview, after the user has already walked through their process mentally, and then see what rises to the top as most pressing. But don’t make the mistake of assuming this is the one thing you should fix :-)

Unstructured inquiry can help with “what could we do” but not “how should we do it”

How Should We Do It?

So you’ve talked to customers about what Job they’re trying to get Done. You’ve identified certain parts of the process that are more laborious or have more potential for error than ideal.

Stop. Don’t ask your customers how to fix it. That’s your job.

This is where the skepticism about customer research comes from. People have NO IDEA what they’re going to do in the future. We laugh at new product concepts.. until they work for us.

The reason both Jobs to be Done approaches focus more on understanding the customer Job than co-designing solutions is because no one can predict the future.

Switch interviews focus on improving the likelihood of trial and retention of a product. ODI focuses on improving a product’s efficacy in getting the customer’s Job Done.

.. ok, that’s it for now. would love thoughts/impressions/suggestions on where to take this!

Learned something? Click the 👏 to say “thanks!” and help others find this article.

Andrea Hill is the principal consultant at Frameplay. Frameplay is an innovation consultancy that helps companies become more customer-focused and thrive in a rapidly changing world. Learn more at