My top takeaways from Jared Spool’s ‘UX Strategy Playbook’ Workshop.

Totally stoked with the workshop.

My colleague Jason Zhou and I went to Jared Spool’s UX Strategy Playbook Workshop last month.

It was excellent. I highly recommend taking a colleague with you when you go, as most of the content is very practical.

I’ve summarised my top takeaways below.

To design a successful product we need to understand our customers—

To increase our chances of creating a really great product, we must intentionally design a working environment where, by default:

  • Everyone is focussed on solving customer problems.
  • We understand these problems deeply, based on first-hand observation.
  • Problems are prioritised based on how impactful they are to our customers and business.
  • We are realistic and take appropriate action based on our level of confidence.

—so our whole team must become researchers.

The most effective way to encourage this environment to develop is to scale up the research capability of the team and company overall, so that everyone can help us to reach the clarity we need.

We should ask four questions:

1. Are we framing our work in terms of solving customer problems?

  • Are we investing enough time in up-front exploratory research to understand how people experience our product and solve their key problems today?
  • Have we mapped the existing customer experience, and do we use this to plan a roadmap that is centred around prioritised customer problems?
  • How can we focus the work of our product teams on end-to-end customer experiences, not feature sets or product lines?
  • What small tricks can we use to reinforce this thinking—like naming our teams after the problem they’re solving for customers (the core JTBD)?

2. Is the whole team actively participating in research?

  • Are we encouraging a high level of first-hand exposure to customers? Across executives, subject matter experts, and team members — does everyone have the opportunity to directly observe customers using our product? How are we up-skilling our SMEs and stakeholders to ask better questions when they do interact with customers?
  • To encourage active observation, can we give team members specific jobs to do during research and synthesis that leverage their individual strengths, such as watching for product complexity or missed expectations?
  • Have we involved the team in defining a research-based, 5-year vision for the experience they’re building? This memorable, shareable narrative helps teams make the day-to-day tactical decisions required, confident they are still heading towards a transformative experience.

3. Are we prioritising each customer problem we could solve based on relative scale and impact?

  • Have we had frank conversations with our leadership to identify what business objectives our company has? Can everyone on the team understand and communicate how these objectives map to the customer goals and problems we’re trying to solve for?
  • How are we synthesising and summarising the research insights we are gathering so everyone is able to share what they have learned, and our insights about customer problems aren’t anecdotal or biased?
  • When prioritising, are we gathering enough data to evaluate the scale of impact of solving the problem through both a business and a customer lens? Have we quantified the areas of frustration and opportunity we have identified? How many customers are affected, and how much does this impact our business goals?
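The kind of prioritisation question 3 describes — weighing how many customers a problem touches against how much it matters to them and to the business — can be sketched as a toy scoring model. Everything below (the problem names, the numbers, the log-weighted formula) is my own illustration of the idea, not something prescribed in the playbook:

```python
import math

# Illustrative only: one simple way to rank customer problems by combining
# reach with customer- and business-impact estimates. The data and the
# weighting are made up for this sketch.

problems = [
    # (problem, customers affected, business impact 1-5, customer pain 1-5)
    ("Checkout errors on mobile", 4200, 5, 4),
    ("Confusing onboarding emails", 9800, 3, 3),
    ("Slow report exports", 650, 2, 5),
]

def score(customers_affected, business_impact, customer_pain):
    # Scale reach logarithmically so a huge segment with mild pain
    # doesn't automatically drown out severe pain in a smaller group.
    return math.log10(customers_affected) * (business_impact + customer_pain)

ranked = sorted(problems, key=lambda p: score(*p[1:]), reverse=True)
for name, *rest in ranked:
    print(f"{score(*rest):5.1f}  {name}")
```

The exact formula matters far less than the conversation it forces: to fill in those numbers at all, the team has to gather real data about reach and impact rather than rely on anecdote.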

4. Are we being transparent about our level of confidence in our knowledge about the best solution?

  • Before we do anything—before research activities, before user testing, before shipping work—are we recording up front what results we think we’ll see? Stating our hypothesis beforehand will highlight how often our assumptions and inferences are wrong. This will humble us, and encourage us to focus on first-hand customer exposure.
  • We can always infer something from the data we gather — but when is this an inference, and when is it an actual observation? Are there inferences we need to turn into observations by studying first-hand what customers are doing?
  • When we have decided on a particular solution, what real indicators of performance should we be tracking, directly tailored to the problem we are trying to solve?

This is my focus for 2020 and my biggest lesson from the workshop—

To create an environment that fosters great products, the most impactful thing we can do is increase the research capability and customer focus of our team, and our company.

--


I design software that helps businesses evaluate and improve their products, services, processes and safety.

Timothy Greig
