The Top 5 Questions About Trusting UX Research

You’ve got (user) data.

Here are the top five questions I get from startups, VCs, and Fortune 500 companies alike — and some helpful answers — about why and how you can trust user experience research.

Q: I’ve heard that you shouldn’t listen to your users directly, or that you shouldn’t build what they ask you for. Is this true?

A: Nope, this is not true. It’s time to let go of that infamous (and contested) Henry Ford quote, “If I had asked people what they wanted, they would have said faster horses,” as an excuse not to talk directly to your customers. There is actually a lot of insight packed into ‘faster horses’! You should absolutely listen to your users. Listen to them directly and often.

It’s how you interpret what they say and how you understand your users that matters.

The CPO of an Internet company with over 40 million users told me a little while back, “No one has ever won by not understanding customers.” How true! As a product developer or designer, it’s your job to understand deeply what your users and customers need.

Although it’s true that most people can’t always TELL you directly exactly what it is that they want or need in technology (they are mere mortals, after all), there are helpful techniques for understanding the needs behind what your users are saying. These insights aren’t usually handed to teams easily, though. Instead of throwing your hands up and saying it can’t be done, try some thoughtful user research to help you measure and interpret both behaviors (what users do) and feedback (what users say and ask for), so you can understand the full picture of users’ needs and serve them. Everyone deserves to be heard.

Q: Why should I believe the user research findings gathered from only 8 users?

A: Research has shown that the most pervasive usability obstacles will surface within 5–8 user tests. Repeat, iterative studies with smaller samples have since become a usability-testing industry standard for gathering the most critical usability information with limited resources. I’ve personally used this method for over a decade, and it has worked every single time. Also see my prior blog post on qualitative uses of small samples (spoiler: it’s about seeing patterns).
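The “5–8 users” guideline is often explained with the Nielsen & Landauer problem-discovery model (the post doesn’t cite it by name, so treat this as an illustrative assumption): the expected share of usability problems found by n test users is 1 − (1 − L)^n, where L (commonly estimated around 0.31) is the average chance a single user uncovers a given problem. A minimal sketch:

```python
def problems_found(n_users, discovery_rate=0.31):
    """Expected share of usability problems uncovered after testing
    n_users participants, under the Nielsen & Landauer model.

    discovery_rate is the assumed probability that one participant
    surfaces any given problem (0.31 is a commonly cited average)."""
    return 1 - (1 - discovery_rate) ** n_users

# With the assumed rate, a handful of users covers most problems,
# and returns diminish quickly after that.
for n in (1, 5, 8, 15):
    print(f"{n:2d} users -> {problems_found(n):.0%} of problems found")
```

With L = 0.31, five users are expected to surface roughly 84% of problems and eight users roughly 95%, which is why iterating with several small studies tends to beat one large one.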

Q: In my personal experience I’ve seen something very different than what the user research reports. What should I believe?

A: Each perspective and observation about the world around us is valid and valuable, including those of the people building products and applications. However, it’s important to remember that if you are involved in building something, you have a uniquely strong emotional and practical connection to a particular design or decision. The practice of user research focuses on gathering usage data from actual customers, and in the process reduces the bias that individual team members’ perspectives can introduce.

Q: What if a User Research report presents a finding that appears to conflict with prior or other research?

A: Different research studies usually have different goals and frequently yield different findings or insights. Most likely, the prior research used a different method or had a different goal in mind. A common example is a marketing organization studying customer perception of a brand, versus a product team studying how users use a product. Unless the exact same study was conducted at the same time with different results, you needn’t worry. Focus instead on gleaning the insights that are unique to each study. If you can, consult an expert researcher to understand why certain findings diverge.

Q: If a user research finding does not also appear in my customer service data, should I believe it?

A: Yes. Both are equally valuable, given their differing goals and contexts. Neither user research nor customer service data will be 100% comprehensive and cover every possible product use or user issue. CS data lets you focus in and get great detail on critical bugs such as functional or quality problems and billing-flow issues, whereas user research usually gathers information on specific uses, preferences, and obstacles within the overall product experience.


Originally published at userlens.com.
