How to Run Live User Testing, Part 3: The Debrief

Turning user feedback into actionable insights

Brenden Mulligan
LaunchKit Library

--

This is the final post of a three-part series on how we run live user testing at Cluster. After months of constant iteration, we’ve paused to focus on testing before the next release of our iOS and Android apps. Follow us on Twitter at @cluster.

The first post focused on getting the test set up: deciding on a specific thing to test, deciding when and where to run the user study, deciding what type of users to study, recruiting participants on Craigslist, trimming the candidate list, prioritizing and scheduling candidates, and getting the right equipment.

The second post focused on actually running the tests: arranging the room, meeting the participant, introducing the study, not revealing the answers, simulating app discovery and installation, walking the user through the prototype, and wrapping up.

This post will focus on taking all that amazing feedback you just gathered and parsing it into useful, actionable intelligence.

One more time, I want to give a huge amount of thanks to Michael Margolis and the Google Ventures design team, who taught us most of these techniques.

Debriefing

So what next? You have a set of recordings containing incredible user feedback, and now it’s time to parse and process it so you can turn the insights into action items for your product. Here’s how to pull out the good stuff and get your team on board with next steps:

  1. Schedule a viewing party
  2. Distribute materials
  3. Capture insights and ideas
  4. Combine everyone’s insights
  5. Identify patterns
  6. Bucket patterns according to app sections
  7. Address the problems & start again

Like the first two parts of the study, this takes time, but it’s well worth it in the end.

Schedule a viewing party

The first step is to schedule a time for the company’s key decision makers to review the videos. Including everyone might sound like a waste, but it’s very important. I know because we didn’t always do this.

When the whole team sits around, you have a much bigger opportunity to understand what feedback is important and what is worth ignoring. You’ll pull a lot more out of the sessions.

Reviewing the videos with the Cluster team

So, even though it seems like a lot of time, schedule time for the core team to watch all the videos in their entirety. Don’t skip this, or you’ll have wasted all the time you invested up to this point.

Distribute materials

While watching the videos, the goal is for the team to capture insights and ideas from what the user is saying and doing. The tools we use to do this are:

Tools for capturing thoughts and ideas
  1. 3×5-inch Post-it notes
  2. Legal pad
  3. Regular pen
  4. Thick marker

Capture insights and ideas

We’ve found it helpful to instruct everyone to capture two kinds of things on paper while watching:

  1. Important insights (good and bad) learned from the user, on the notepad
  2. Any specific implementation ideas, on post-it notes

My notes and ideas from 5 videos (slightly blurred for confidentiality)

The majority of the writing will be insights from the user and should be captured with a normal pen and paper. Examples of insights are things like:

  • “Blue button really works”
  • “Really understood intro page”
  • “loves design”
  • “had no idea what to do first”

Everyone should write down everything they notice from what the user is saying / doing.

If an insight leads a team member to a specific idea for how to fix or improve the app (e.g., “Add ‘We will never post’ under the Facebook login button”), have them write each idea on its own post-it note using a thick black marker. Getting these ideas down on paper lets the team member focus on the rest of the study instead of trying to remember the idea.

Combine everyone’s insights

After watching each video, have someone stand at a whiteboard and write down the insights everyone captured. Use a green marker for insights that worked, a red marker for things that didn’t work, and black for general insights. Only write down things that more than one team member heard. The purpose of this step is to distill everyone’s pages of notes into a common list of insights to address later.

The board after one of our debriefs (slightly blurred for confidentiality)

Create one column of notes for each study participant. Then, somewhere off to the side, have everyone put up their post-it notes in no particular order.

Identify patterns

Once you’ve watched all the videos, and everyone’s insights are up on the board, it’s time to find patterns. It’s not worth fixing every small issue each person had, because all users are different. The goal of the entire exercise is to figure out the recurring problems that might also affect users in the wild.

With a new color, circle or put a dot next to any insight that comes up more than two times (or in more than 40% of sessions, depending on how many participants you had). Do this for green, red, and black insights.
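Our debrief happens on paper and whiteboards, but the tallying rule above is easy to make precise. Here is a minimal sketch, assuming each participant’s notes are reduced to a set of short insight strings (the function name, threshold handling, and sample insights are all illustrative, not part of our actual process):

```python
from collections import Counter

def find_patterns(session_notes, min_count=3, min_fraction=0.4):
    """Flag insights that recur across enough sessions.

    session_notes: one set of insight strings per participant, so an
    insight counts at most once per session (mirroring the one column
    per participant on the whiteboard). An insight is a pattern if it
    appears more than 2 times and in more than 40% of sessions.
    """
    sessions = len(session_notes)
    # "more than 2 times" => >= 3; "more than 40% of sessions" => strictly above
    threshold = max(min_count, int(min_fraction * sessions) + 1)
    tally = Counter(insight for notes in session_notes for insight in notes)
    return {insight: n for insight, n in tally.items() if n >= threshold}

notes = [
    {"blue button works", "loves design"},
    {"blue button works", "had no idea what to do first"},
    {"blue button works", "loves design", "had no idea what to do first"},
    {"loves design"},
    {"blue button works"},
]
patterns = find_patterns(notes)
```

With five participants, only insights mentioned in three or more sessions survive, which matches the physical circle-the-repeats step.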

Bucket patterns according to app sections

Now that you know what common insights came from testing, it’s useful to categorize what part of the app the insights affect. With Cluster, our users were testing four sections:

  1. Pre-Registration Screens (including App Store marketing)
  2. Creating & Naming Initial Cluster
  3. Inviting Friends
  4. Uploading Photos & Exploring App

On another whiteboard, we made a column for each section. Then, we went through the first whiteboard and moved each recurring pattern over to the appropriate column.

Common insights and patterns (slightly blurred for confidentiality)

Suddenly, hundreds of insights were distilled and organized according to the part of our app they affected. An actionable list emerged.

We then moved the post-it ideas underneath the insights they addressed. What we found was that most negative insights already had some great suggested improvements attached.
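The second whiteboard is essentially a grouping pass: each recurring insight goes under the app section it affects, with any matching post-it ideas attached. A small sketch of that structure, using Cluster’s four test sections from above (the specific insights and the tooltip idea are hypothetical examples, not our real findings):

```python
# One column per app section, as on the second whiteboard.
SECTIONS = [
    "Pre-Registration Screens",
    "Creating & Naming Initial Cluster",
    "Inviting Friends",
    "Uploading Photos & Exploring App",
]

# Recurring patterns from the first whiteboard: (insight, section, marker color).
patterns = [
    ("really understood intro page", "Pre-Registration Screens", "green"),
    ("had no idea what to do first", "Uploading Photos & Exploring App", "red"),
]

# Post-it ideas, keyed by the insight they address (illustrative).
ideas = {
    "had no idea what to do first": ["show a short tooltip on first launch"],
}

# Bucket each pattern into its section, carrying its attached ideas along.
board = {section: [] for section in SECTIONS}
for insight, section, color in patterns:
    board[section].append((insight, color, ideas.get(insight, [])))
```

Sections with an empty list are the parts of the app the test raised no recurring issues about, which is useful information in itself.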

Address the problems & start again

Now it’s time to get to work improving the product to address the common negative insights you found during this process. Then start over and do the whole thing again. Keep iterating until you’re satisfied that the remaining problem areas are minor.

You did it!

You’ve now completed the entire user testing journey! I know it’s a lot of work, but once you do it a few times, you’ll start to see problem areas disappear and users start to understand your app much faster. It’s invigorating. I’d encourage you to trust the process and see what happens. I guarantee your app will become easier and more delightful to use.

Thanks for reading this! If you got value out of this article, I would really appreciate you hitting the recommend button below. And as always, please reach out on Twitter @mulligan if you have any questions or comments!

--
