Lessons from the frontlines: product discovery

Tom Sautelle · Published in Humans of Xero · Nov 20, 2019

As a product person I have struggled to find writers on product management who are working on the frontline. This series aims to share my own thoughts and highlight valuable resources to help product people in the trenches.

Sometimes doing product feels like jumping into the unknown without a parachute.

And one of my professional development goals is to write more — so there is that as well. 😉

Discovery and delivery

For this post I want to focus on discovery and delivery. I was first introduced to this concept through Jeff Patton and it struck a chord with a common problem I had observed — teams spending months:

  1. 🏗 Creating production quality™️ features
  2. 🌀 Wading through long release cycles (including weeks of regression testing, something that needs a post of its own)
  3. 📊 Waiting for data on usage then spending ages analysing it
  4. 😖 Realising that those features actually suck
  5. ☎️ Talking to customers to find out why
  6. 📆 Planning improvements
  7. 🔁 Starting again from step 1

This is a really, really expensive and demoralising way to figure out the right thing to build. You end up feeling like this guy:

Me thinking about doing another release cycle to fix that feature — and the fight needed to make space in the roadmap for a v2.

Enter discovery: a much faster, lower-cost way to figure out what works, by researching and testing ideas without writing production-quality code (or running the release gauntlet 🏃).

This is done by recognising:

  1. Most ideas are not… great
  2. The faster you can validate which ideas are any good, the easier it is to continually deliver features that make customers happy
  3. Discovery requires a different skillset and process than product delivery (Scrum and Kanban are not discovery frameworks, they are delivery frameworks)

Jeff summarises it like this:

Discovery and delivery model — Jeff Patton

So what does this look like in practice?

The (cliché) answer is that it depends. Each opportunity, idea or problem to solve requires different methods to work out whether it is any good. But Tom, you called this Frontlines; I assumed I would get actual advice?! 😧

I know, I know. While there is no silver bullet (another cliché), there are:

a) A common principle (incoming Venn diagram) that I think is fundamental

and

b) A couple of examples that come to mind when I think about doing discovery work from a product perspective

The product triad, or trio, or… whatever else people call it

The product triad is not a new concept, but I think it is required to do discovery well. The idea is pretty simple. For us to know if an opportunity, idea or problem space is actually worth pursuing, we need to test whether:

  • 🏢 It is a real problem for our customers that is worth solving (aka a big enough opportunity for our business)
  • 👩‍💻 It is feasible for us to build a technical solution
  • 🔨 The solution we could build is usable by our users
Atlassian — https://www.atlassian.com/blog/technology/engineering-team-structure

So what has this got to do with discovery? I believe that to truly test if an idea is worth pursuing, it needs to be validated from each of these perspectives, using tools relevant to that perspective. Given I spend most of my time in the product space, some common approaches to validating ideas from that angle come to mind. They all fall into two categories: impact and reach.

Impact and reach

The following are some of the go-to practices I use when trying to evaluate an idea (⚠️ your mileage may vary 🚙).

I start by trying to gather qualitative and quantitative data to understand if the idea solves an actual problem, including how big a problem it is for the people who experience it (impact) and how many users have this problem (reach).

Impact

Understanding impact is all about listening to your users and understanding what they have to do when they hit the problem. It’s about empathy and is essential for getting buy-in later when you are asking people to help prioritise ideas.

You can get this by going anywhere your users talk about their challenges: feature requests, Slack channels, forums, conversations with sales folks, support tickets, and so on.

And you can interview them. Nothing is quite as powerful for understanding impact as having a human describe it to you first hand.

A recent case I was looking into had a low number of support tickets per month, but high impact on users and a huge time investment from our support and product teams for each case.

Reach

If I can establish some kind of behavioural (anti-)pattern or user flow that users follow because we are not solving this problem (see Impact above), I follow up with quantitative behavioural analysis:

  • I currently use Google Analytics, and BigQuery with SQL if I get desperate, to quantify the % of users in different segments who follow the anti-pattern (see the sketch below)
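
To make that concrete, here is a minimal sketch of the kind of segment breakdown I mean, using the google-cloud-bigquery Python client. The project, table and column names (`analytics.events`, `user_segment`, `hit_antipattern_flow`) are hypothetical stand-ins; substitute whatever your own analytics schema uses.

```python
# Hypothetical sketch: % of users per segment who follow the anti-pattern flow.
# All table/column names are made up; match them to your own analytics schema.
from google.cloud import bigquery

client = bigquery.Client()

query = """
SELECT
  user_segment,
  COUNT(DISTINCT user_id) AS users_in_segment,
  COUNT(DISTINCT IF(hit_antipattern_flow, user_id, NULL)) AS users_on_flow,
  SAFE_DIVIDE(
    COUNT(DISTINCT IF(hit_antipattern_flow, user_id, NULL)),
    COUNT(DISTINCT user_id)
  ) * 100 AS pct_on_flow
FROM `my-project.analytics.events`
WHERE event_date BETWEEN '2019-08-01' AND '2019-10-31'
GROUP BY user_segment
ORDER BY pct_on_flow DESC
"""

for row in client.query(query).result():
    print(f"{row.user_segment}: {row.pct_on_flow:.1f}% of {row.users_in_segment} users")
```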

Also, if there is a common problem in your domain that your product doesn’t solve, your support team (almost) always already knows about it. They can be a great source for understanding reach:

  • Look for trends (cases 📈 or 📉), including how frequently users experience the problem and the % of customers impacted. It’s important to capture this over time so you get a sense of the time spent helping customers with the problem versus the cost of fixing it
  • In my context I do this by getting our CX lead to help me pull the data from Salesforce, bringing it into Google Sheets to tidy up the date columns, then graphing it into a nice report in Data Studio (a rough sketch of the aggregation step follows this list)
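
For the aggregation step, a pandas version of the same idea might look like the sketch below, assuming the Salesforce cases have been exported to a CSV. The file name and the `opened_date` and `case_reason` columns are hypothetical; I actually do this part in Sheets, but the logic is identical.

```python
# Hypothetical sketch: monthly trend of support cases for one problem area.
# File and column names are made up; match them to your own Salesforce export.
import pandas as pd

cases = pd.read_csv("salesforce_cases_export.csv", parse_dates=["opened_date"])

# Keep only the cases tagged with the problem we are trying to size.
problem_cases = cases[cases["case_reason"] == "example-problem-tag"]

# Count cases per month so we can see whether the trend is 📈 or 📉.
monthly = (
    problem_cases
    .set_index("opened_date")
    .resample("M")
    .size()
    .rename("cases_per_month")
)

print(monthly)
```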

How do I know it’s worth it?

Using some of these approaches helps me make sure that I have understood the problem, whether and why we should solve it, and how important it is. I learnt this the hard way with the first major feature set I worked on as a brand new Product Owner in a previous role. I am sure some of you will be familiar with the scenario:

  1. The problem we were solving had been defined as a set of new features that needed to be delivered (in 3 months because… reasons)
  2. There was no need for any research, validation or prototyping as there was so much confidence that polishing and productising our existing work-around was the right solution — we already had “so many” customers using it
  3. We had already announced that we would be delivering it… 2 years ago… so the pressure was on

So being the inexperienced product person that I was, my teams and I ran with it and hustled to hit the timelines. We got an M(V)P (aka a minimum product that missed the viable and/or valuable part) out in time for the launch of our major release and… silence.

Well, at least for 6 months. Then a flood: customers and partners angrily complaining that the feature sucked and made no sense. To make matters worse, we also gained the capability to measure our users’ behaviour in our application around the same time, and our shiny new feature’s user adoption was painfully low. My stakeholders were not happy and I was devastated.

So what does this story have to do with discovery? Well, after we realised our failure, we did what I would now do from the start: we did product discovery.

  • 😠 We interviewed customers about the problem we were trying to solve, to check if it even mattered to them. Fortunately for us, they did have the overall problem we had based the feature on, which was good. It was also why they were so mad with our crappy solution!
  • 🔭 We built many prototypes of new solutions to figure out how to solve it well, spoke with our customer support teams about the issues being reported, met with subject matter experts to identify any gaps, and did multiple releases while continually monitoring our adoption
  • 🍾 After 6 months, we had lifted our feature’s adoption from terminal levels to one of the most used features in our product, and had proved the importance of product discovery and usage analytics to our leadership team

Just think: if we had done discovery at the start, I would have saved nine months (and our developers and customer support people countless hours) and avoided the trauma inflicted on us and our customers by a big launch going wrong. And I would have got a whole lot more sleep to boot!

That’s a wrap 🎬 for this post. Let me know what you think about product discovery and what tools and frameworks you use. Stay tuned for my next article in the series!

📽 📖 If you want more resources on product discovery, see below 👇
