What to do when desirability doesn’t cut it (as a starting point)

Dan (Head of Innovation)

“Guys, we’ve been doing some great work, but I’d like to challenge you. Spend the next month observing human behaviour, in any context you like, synthesise your findings and come back to me with the number one problem you believe people are extremely motivated to solve, but haven’t yet found a solution to. This will be the next product we work on and (ideally) take to market”.

Innovation Team

(Bewildered looks are cast around the room. For a moment there is silence, with no one knowing quite how to respond to the change of direction)

Belle (Lead UX Designer)

“Literally any context?”


Dan

“Yep. Go get ’em.”

The reality is that, unless they’re working on Moonshots or exclusively in a research capacity, commercial product development teams have probably never experienced an interaction like the one above with their boss.

Here’s a scenario that more closely represents ‘common’ reality.

Dan, as the Head of Innovation, is tasked with driving revenue through new ‘non-core’ products, services and channels. It’s therefore his team’s job to find and solve high-value problems that help Dan, and the company he works for, meet the objective of de-risking through diversification (i.e. establishing numerous sustainable revenue streams), and, in the process, create greater shareholder value.

Here’s Dan’s dashboard. This is the thing Dan relies on to give him a high-level snapshot of his department’s progress. He also finds himself digging deep into the data on occasion.

Coming into work on Monday morning, Dan decides to review some of the key metrics associated with one of the products the team released to market at the very beginning of the year.

He notices a few interesting things, namely that they’re experiencing extremely high churn in the current month. With a fairly clear understanding of the product’s unit economics, Dan has an ‘Oh Shit!’ moment.

The problem is pretty simple: if they can’t solve the churn issue, it’s likely that their CAC (Customer Acquisition Cost) to LTV (Customer Lifetime Value) ratio will be terrible. The company will lose money and Dan will be left to pick up the pieces.

Dan’s got a business problem and he and the team have got to solve it.
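To make Dan’s ‘Oh Shit!’ moment concrete, here’s a minimal sketch of how churn drives the LTV to CAC ratio. All figures are hypothetical, and we’re using the common approximation LTV ≈ ARPU × gross margin ÷ monthly churn:

```python
def ltv(arpu: float, gross_margin: float, monthly_churn: float) -> float:
    """Approximate lifetime value: gross profit per month divided by churn rate."""
    return arpu * gross_margin / monthly_churn

def ltv_to_cac(arpu: float, gross_margin: float,
               monthly_churn: float, cac: float) -> float:
    """LTV to CAC ratio; a common rule of thumb is that >= 3 is healthy."""
    return ltv(arpu, gross_margin, monthly_churn) / cac

# Hypothetical figures for Dan's product
cac = 300.0    # cost to acquire one customer
arpu = 50.0    # average monthly revenue per customer
margin = 0.8   # gross margin

healthy = ltv_to_cac(arpu, margin, 0.04, cac)  # 4% monthly churn -> ratio ~3.3
broken = ltv_to_cac(arpu, margin, 0.15, cac)   # 15% monthly churn -> ratio ~0.9

print(round(healthy, 2), round(broken, 2))
```

Nothing else about the product changed between the two cases; churn alone pushed the ratio from healthy to loss-making, which is exactly why Dan treats this as the problem to solve.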

So why aren’t we starting with desirability?

When Bianca and I were a little younger, and a little less seasoned, we likely experienced moments of blind faith. Human-Centered Design, and the simple notion of starting with human desirability rather than a technical capability or commercial objective, was extremely promising. It was inspiring even.

So have we lost the faith? Not at all. We still do our absolute best to practice Human-Centered Design (with a few twists) as effectively as we possibly can.

What’s different, however, is that we have now been exposed to a broad enough spectrum of competing priorities that we truly appreciate, and greatly value, the balancing act that is design.

We now understand that, rather than starting with the human, a business need, objective or insight — something that has a fundamental impact on the way an organisation operates — is our frame of reference and driving force.

This is the thing that guides budgets, allocates bonuses, and often, decides who is going up, who is staying where they are, and who is departing the organisation for good.

So where exactly does this leave us?

Well, to be frank, it doesn’t change the way we, as product managers, designers and engineers, operate all that much.

Let’s explain this by guiding you through the typical process we would execute to validate an insight or deepen our understanding of a business objective, in an attempt to create value for the organisation we work for by creating new and unique value for that organisation’s customers.

Our process — a high-level view

Our frame of reference, just as we outlined above in the example with Dan, is the insight or business objective. This sets the context for the way we approach our process, and may enable us to develop some clear and testable initial hypotheses that we can take to market.

At the highest level, our process consists of four key steps. These are:

1. Validating the insight or deepening our understanding of the business objective

2. Identifying the main and related Jobs to be Done (JTBD) that customers typically hire our product to fulfill

3. Determining the importance of the Job to be Done versus the satisfaction customers have with existing solutions (this helps us to determine the propensity for behavioural change), and

4. Through constant iteration and customer collaboration, validating we have fulfilled the Job to be Done and met the business objective (which occurs iteratively and incrementally over a period of time)

The number one rule of our process is that each step deeply involves the customer. This is us putting into practice Steve Blank’s motto of, “Get out of the building.”

We’ve actually written about Jobs to be Done a few times, highlighting why the approach is an asset to any product or design team. We’ve also proposed a practical framework for using Job Stories instead of user stories. However, within this moment, the following quote should give you everything you need.

“If you remember anything about jobs to be done, remember this: they are completely neutral of the solutions you create (your products and services). While a customer JTBD remains fairly stable over time, your products and services should change at strategic intervals as you strive to provide ever-increasing value.” — The Innovator’s Toolkit, 2012

Using the situation Dan finds himself in as an example, let’s examine this further.

1. Validate the insight

This part of the process, like the others, will consist of various activities that are interchangeable based on what you’re trying to achieve.

Within this example, it’s likely we’d want to conduct some internal interviews to gather insights from the collective intelligence of our team. We’d also likely want to dive deeply into the usage patterns, particularly around onboarding and Time to Value (TtV). We could do this through Mixpanel or any other analytics capability we have in place. Additionally, we’d start speaking to some customers to start learning about their experience and objectives.

Within this phase we’d also explicitly set the context and timeframe for how we intended to go about tackling the problem we’d been presented with. This is a critical activity for managing expectations internally and securing the support you need to execute effectively.

After validating the insight, or diving deeper into the business objective, we would switch our focus over to uncovering the main and related Jobs to be Done.

You’ll notice this sounds alarmingly like, “starting with desirability.” In many ways that’s because it is. As soon as we understand our context and objectives explicitly, we can progress right back to starting with the human.

2. Define the Job to Be Done

There are actually two different types of Jobs to be Done. These are:

1. Main Jobs to be Done — the high level goal or outcome the customer has in mind

2. Related Jobs to be Done — the steps along the way that cumulatively add up to ‘enable’ the main job

Within both categories of Job to be Done, there are two aspects of each job. These are:

Functional job aspects — the practical steps required to fulfill the job

Emotional job aspects — the feelings experienced, or desired feelings experienced, whilst completing the job

And lastly, the emotional job aspects are broken down further to reflect key intrinsic and extrinsic motivators. These are:

Personal dimension — the intrinsic motivation behind fulfilling the job

Social dimension — the extrinsic motivation behind fulfilling the job
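The taxonomy above can be sketched as a simple data structure. This is purely illustrative — the names, fields and the example job below are our own hypothetical rendering of the framework, not part of it:

```python
from dataclasses import dataclass, field

@dataclass
class EmotionalAspect:
    personal: str  # personal dimension: intrinsic motivation
    social: str    # social dimension: extrinsic motivation

@dataclass
class Job:
    statement: str
    functional: list[str]       # functional aspects: practical steps
    emotional: EmotionalAspect  # emotional aspects of the job

@dataclass
class JobMap:
    main_job: Job                              # the high-level goal or outcome
    related_jobs: list[Job] = field(default_factory=list)  # enabling steps

# Hypothetical example loosely based on Dan's analytics product
main = Job(
    statement="Get a new analytics product producing insight within a week",
    functional=["connect data sources", "configure key metrics"],
    emotional=EmotionalAspect(
        personal="feel in control of product performance",
        social="be seen as data-driven by leadership",
    ),
)
job_map = JobMap(main_job=main)
```

Structuring interview findings this way makes the main/related and functional/emotional split explicit, which is the granularity the framework is asking for.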

Our objective is to use the Jobs to Be Done Framework to develop a fairly granular understanding of the human need we’re trying to fill.

It’s also common for us during this step to take the insights from user interviews and map the customer journey. This not only serves design purposes, but is also an effective communication tool that can assist our team in empathising with the human beings we intend to serve.

Lastly, it’s worth noting we’re big believers in starting the ‘design process’ early. If appropriate, we will have already gone through a lot of paper wireframes, put them in front of customers and worked through the iterations to reveal preference. These insights help form part of phases 3 and 4.

Once we believe we clearly understand the main and related jobs, we work towards identifying the importance of those jobs versus the satisfaction our customers have with existing solutions (ours, our competitors, or their own). We do this to try and determine if we’re solving a high value problem for the right people.

Not solving the right problem for the right people could definitely have a direct effect on Dan’s churn statistics, so let’s keep going.

3. Is this a Problem Worth Solving?

As suggested in the quote from The Innovator’s Toolkit above, jobs rarely change. Yet our capability to fulfill jobs in new, meaningful and valuable ways is constantly evolving.

Because of this, it’s critical that we survey the landscape of competitors — the mechanisms through which a job can be fulfilled. This serves the purpose of understanding how satisfied people are with the way they’re currently fulfilling the job. This could be through a process they’ve designed themselves, it could be through another product or service, or it could be that there’s no effective way for them to fulfill the job, and they therefore neglect it to their detriment.

Regardless of the answer, this is something we try to discover as early as we can.

Whilst doing this, we employ the Problem Worth Solving (PWS) Framework to help us also determine how important a job is. What we’re aiming to discover is the propensity people have to hire a new solution. To put it simply, we’re actually assessing if there’s a genuine need in market for the problem to be solved or the job to be fulfilled. The last thing we want to do is waste time, resources and capital on a problem that isn’t worth solving.
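The PWS Framework itself doesn’t prescribe a calculation, but one widely used way to rank jobs by importance versus satisfaction is Ulwick’s opportunity score: importance + max(importance − satisfaction, 0), with both values averaged from customer surveys on a 1–10 scale. Here’s a minimal sketch with hypothetical survey data:

```python
def opportunity_score(importance: float, satisfaction: float) -> float:
    """Ulwick-style opportunity score: important, under-served jobs rank highest."""
    return importance + max(importance - satisfaction, 0.0)

# Hypothetical survey averages (1-10 scale) for three candidate jobs:
# {job: (importance, satisfaction with existing solutions)}
jobs = {
    "get value from the product in the first week": (9.1, 3.2),
    "export reports for leadership": (6.0, 7.5),
    "customise the dashboard layout": (4.5, 4.0),
}

# Rank jobs from biggest to smallest opportunity
ranked = sorted(jobs.items(),
                key=lambda kv: opportunity_score(*kv[1]),
                reverse=True)

for name, (imp, sat) in ranked:
    print(f"{opportunity_score(imp, sat):4.1f}  {name}")
```

Scores above ~10 are generally read as under-served, high-value opportunities; a job people already fulfil satisfactorily (like the export example) scores low no matter how you dress it up, which is the framework’s guard against wasting time, resources and capital.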

Whilst the core of the PWS Framework is in action, we’d again want to be revealing preference by exposing customers to wireframe iterations based on the insights we’ve synthesised to date. We would also typically conduct desirability testing through various channels to determine if our refined value proposition is strengthening in the eyes of the real people we intend to serve.

The insights gathered during previous components of our process are then synthesised and tested further. Sometimes we’ll start this in a simulated (acting out the Job to be Done) environment, but as quickly as we possibly can, we want to get our optimised product or service into the wild to test its efficacy: how effectively it fulfills the high-value Job to be Done our customers chose to hire our product for.

4. Have we solved the problem and met the business objective?

Depending on the context, there are likely two key measures that need to be closely observed. The first is product or service efficacy. This can be monitored quantitatively and qualitatively, and will at times also be determined through inference. The second is the key business objective/s. Without grossly oversimplifying, in Dan’s case this is likely something like the percentage of new paying customers versus the number of customers that churn in any given period. Or rather, the CAC to LTV ratio.

In reality, only you (or Dan) can truly determine what is and what isn’t success. Success is context sensitive.

So what are the key takeaways?

1. Desirability rarely sets our context, but the desire to create value for humans still drives our process

2. The tools we employ within the context of our process are completely interchangeable. We use what we believe will work given the situation. Best practice combined with a little intuition often wins

3. Effective design creates new shared-value. Without business value, human value will not be delivered

4. Effective design is a constant balancing act between what is desirable, viable and feasible

As is always the case, we’re putting this out in the wild so it can be read, reviewed, critiqued and improved. We’re looking forward to the conversation that follows.

Co-authored by Nathan and Bianca Kinch