How This Works co

Musings and writings from How This Works co about product, strategy, design, workshop facilitation, team dynamics, and technology


Customer interviews 101

A lightweight framework for interviewing customers that actually works

7 min read · May 13, 2025


Second in a series of four articles

In part 1 of this series, I explained why customer interviews matter and how teams drift away from customer reality. But knowing there’s a problem isn’t enough — you need a practical way to solve it.

So how do we make customer interviews a practical, repeatable practice instead of an aspirational “when we have time” activity?

After trying a number of different approaches with teams, I’ve found a stripped-down framework that works in the real world:

Who should talk to customers?

Not just researchers or product managers. Everyone building the product should participate in customer interviews at least once a quarter. Yes, including engineers, designers, and even leadership. This creates a shared understanding that can’t be replicated through documents or second-hand reports.

I’ve become a huge fan of the watch party from the Bullseye Customer Sprint methodology. Instead of having just one person conduct interviews while others read the transcript later (which, let’s be honest, rarely happens), you bring a small cross-functional group together to observe customer conversations in real-time.

Here’s why it works: When your engineer, designer, and product manager all witness a customer struggling with the same feature, there’s no debate about priority. The shared experience cuts through opinion battles and creates alignment that no JIRA ticket or meeting summary ever could. Plus, having multiple perspectives means different team members catch different things — your engineer might notice technical confusion while your designer spots navigation frustration.

These watch parties also democratize customer insights. You’re not waiting for the research team to process and present findings — everyone is developing their customer intuition together. I’ve seen teams solve problems on the spot that would have taken weeks to address through traditional channels.

The key is keeping these lightweight — 3-4 team members observing, one person facilitating, and a simple shared doc for capturing insights. No fancy setup needed.

How often interviews should happen

In “Continuous Discovery Habits,” Teresa Torres offers the most practical guideline I’ve found: aim for weekly touchpoints with customers. Not quarterly surveys or annual focus groups — weekly conversations.

That might sound like a lot, but these don’t need to be formal 60-minute research sessions. A 15-minute call to validate a specific assumption counts. So does a 30-minute screen share watching someone use a feature.

The consistency matters more than the duration. Once a week nets out to about 50 customer interviews a year, which is far better than zero.

But it’s not just about the quantity. It’s about building a habit for your product, program, or experience that keeps customer perspectives fresh in everyone’s mind. When teams interview customers sporadically, they treat each session like a major event and expect earth-shattering insights. This creates pressure that can reduce the value of the conversation.

In contrast, when interviews happen weekly, teams develop a more natural rhythm. It’s normal. It’s business as usual. The pressure for each session to deliver breakthrough insights disappears. The team notices patterns across conversations. Everyone becomes comfortable with the silence that often precedes the most honest answers. And the ongoing relationship with customers encourages them to share things they wouldn’t mention in a one-off call.

Two designers reviewing a paper prototype with sticky notes while conducting a remote customer interview with a participant visible on the laptop screen

I’ve tried both approaches. Teams that schedule massive quarterly research pushes often end up overwhelmed with information they can’t process. Teams that talk to just one customer every week build a continuous flow of insights that naturally informs their decision-making.

The regular cadence also means you catch problems earlier. Rather than building for six months based on assumptions and then discovering a fundamental flaw, you’re course-correcting every week based on fresh information.

It’s like the difference between cramming for an exam versus studying a little each day. One might get you through the test, but the other actually helps you learn.

What makes questions effective

The problem with most customer interviews is that they’re accidentally designed to get bad data. When we ask questions like, “Would you use a feature that does X?” we’re begging for polite but useless answers.

Think about how people respond when asked about their gym habits. Ask someone “How often do you go to the gym?” and they’ll likely say “Every weekday, 5 days a week!” But ask “How many times did you actually go to the gym last week?” and you might hear “Twice… my kid was sick one day, and then I had two errands I needed to run.” That second answer reveals reality, not aspiration.

The same principle applies in customer interviews. Better questions focus on past behaviors, not future intentions:

  • “Walk me through the last time you tried to solve this problem.”
  • “What’s the most frustrating part of your current process?”
  • “What else have you tried before this?”

These questions uncover what people actually do, not what they think they might do in some hypothetical future. They bypass the customer’s desire to please you or appear consistent with their self-image, instead tapping into concrete experiences that reveal genuine pain points and needs.

When a customer tells you what they did yesterday rather than what they might do tomorrow, you’ve uncovered information you can actually act on.

How to avoid drowning in data

One legitimate fear about regular customer interviews is ending up with mountains of hard-to-process insights. The antidote is simple: capture and synthesize immediately after each conversation.

Start by combating your biases before they take hold. Before the interviews even begin, I have the team write down what they expect to hear. This pre-interview exercise is crucial for counteracting confirmation bias. We document:

  • What we believe customers will say
  • What problems we think they have
  • What features we assume they want
  • Any industry “common knowledge” we’re operating from

This creates accountability. It’s too easy to selectively remember just the parts of an interview that confirm what we already believe. When we write down our assumptions first, we can’t unconsciously filter what we hear to match our expectations.

Then capture insights immediately, while they’re fresh. After each interview, my team uses a shared document to note:

  • What surprised us (compared to our pre-interview beliefs)
  • What confirmed existing beliefs
  • What contradicted existing beliefs
  • Any direct quotes worth remembering

This lightweight approach means insights get captured without creating a data management nightmare. The quick turnaround is essential — insights degrade rapidly, like fresh produce left on the counter. What’s crystal clear right after a conversation can become fuzzy even 24 hours later.

The comparison between what we predicted we’d hear and what we actually heard is often the most valuable insight of all. Those gaps tell us where our product strategy might be built on faulty assumptions rather than customer reality.
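One way to keep this capture step lightweight is to give the shared doc a fixed shape, so the predicted-versus-heard comparison is mechanical rather than a matter of memory. Here is a minimal sketch in Python mirroring the bullets above; all names (`InterviewNote`, `assumption_gaps`) are hypothetical illustrations, not anything from the article:

```python
from dataclasses import dataclass, field

# A toy capture template mirroring the shared doc described in the text.

@dataclass
class InterviewNote:
    """Filled in immediately after one customer conversation."""
    customer: str
    surprises: list = field(default_factory=list)      # vs. pre-interview beliefs
    confirmed: list = field(default_factory=list)      # matched what we expected
    contradicted: list = field(default_factory=list)   # clashed with what we expected
    quotes: list = field(default_factory=list)         # direct quotes worth remembering

def assumption_gaps(assumptions: list, notes: list) -> list:
    """Pre-interview assumptions that at least one conversation contradicted,
    i.e. the gaps where strategy may rest on faulty assumptions."""
    contradicted = {item for note in notes for item in note.contradicted}
    return [a for a in assumptions if a in contradicted]
```

Even a structure this small turns synthesis into a five-minute fill-in exercise instead of an open-ended writing task, which is what makes the immediate-capture habit sustainable.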

The false economy of skipping research

I often hear teams say they “don’t have time” for customer interviews. This reasoning has always puzzled me. We somehow have time to build features nobody wants, but not time to talk to the people we’re building for?

Let’s do some back-of-napkin math:

  • A typical feature might require 2–3 weeks of design and development time from a small team. That’s roughly 200–300 person-hours of work. If that feature misses the mark because we didn’t understand the underlying need, those hours become a sunk cost.
  • In contrast, five one-hour customer interviews (plus prep and synthesis time) might take 10–12 total hours. That’s still more than a 10:1 ratio of potential waste to preventative investment.
  • Thinking we don’t have time for customer interviews is like thinking we don’t have time to check the map before driving — it only makes sense if you enjoy taking scenic detours and have the time to do it.
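The napkin math above can be checked directly. This snippet only restates the hour figures from the bullets (no new data), taking the worst case for the interviews: the smallest feature estimate against the largest research estimate:

```python
# Back-of-napkin ROI check for five customer interviews.
feature_hours_low, feature_hours_high = 200, 300    # person-hours at risk if the feature misses
interview_hours_low, interview_hours_high = 10, 12  # five 1-hour interviews + prep and synthesis

# Most conservative comparison: cheapest feature vs. most expensive research.
conservative_ratio = feature_hours_low / interview_hours_high
best_case_ratio = feature_hours_high / interview_hours_low

assert conservative_ratio > 10  # even conservatively, potential waste exceeds 10x the research cost
assert best_case_ratio == 30.0  # at the other extreme, the ratio reaches 30:1
```

In other words, the 10:1 claim holds even under the least favorable pairing of the estimates.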

In the next two parts, I’ll guide you through the entire journey of making customer interviews a core part of your product practice:

  • In part 3, I’ll take you inside a Fortune 100 healthcare financing company that discovered their entire understanding of customer barriers was wrong. You’ll see how just 10 customer conversations over two weeks unlocked more than $2 million in retained revenue — a concrete ROI that makes the business case for this practice impossible to ignore.
  • And in part 4, I’ll break down the simple customer interview approach we used at Sesame to launch a new product in just three months with minimal resources. You’ll get a practical blueprint for implementing what I call the “minimum effective dose” of customer research — just five conversations that can transform your product direction.

If you missed it, part 1 explained why customer interviews matter and how teams drift away from customer reality.

In my experience, customer interviews aren’t a nice-to-have research activity or a luxury. This one action separates companies that build what people want from those that build what they think people want.

Thanks for reading. My name is Skipper Chong Warson and I run a product strategy and design services practice called How This Works co. What techniques have you found effective for making customer conversations a regular part of your product development process?

Citations:

  • Teresa Torres — “Continuous Discovery Habits” (2021)
  • Google Ventures (GV) — “Learn More Faster”, source of the Bullseye Customer Sprint methodology and its watch party approach
  • Rob Fitzpatrick — “The Mom Test: How to Talk to Customers and Learn If Your Business is a Good Idea When Everyone is Lying to You” (2013)
  • Tomer Sharon — “Validating Product Ideas Through Lean User Research” (2016)
  • Erika Hall — “Just Enough Research” (2013, second edition 2019)

Written by Skipper Chong Warson

Product strategist, designer, and facilitator at How This Works co, host of the How This Works show. Fjord, thoughtbot, SoftServe, and Shep (acquired) alumni.