Supercharge Your Survey with User Research

Use qualitative insights to design an impactful survey

When you need to answer a specific question at scale, surveys might be the tool for you. Photo by Edwin Andrade via Unsplash.

It seems like a simple question: how do you make your money? If every two weeks you get a paycheque, it’s easy to answer. But what if you have a side hustle, or you make some extra cash from ads on your website?

At FreshBooks, we know that our customers make their money in all kinds of ways—often, in more ways than one. To help them track it, we needed to figure out which ways were most common. And to do so, we decided to run a survey.

This wouldn’t be easy. Our customers don’t all speak about their revenue streams in the same way. Running a survey came with a high risk of customers not understanding the questions. (Not to mention the pervasive risks of running a survey, period — I see you, survey skeptics. Bear with me!)

Knowing this, I decided to reach into my UX toolkit to leverage some familiar qualitative methods to improve our survey and ensure it gave us meaningful data.

Disclaimer: Like any tool, surveys have their strengths and their limitations. A lot has been written about the risks of surveys — in particular, if you’re considering running one, start with this piece by Erika Hall.

Start with user interviews to inform survey language

Everyone talks about their business in their own special way.

You say ad revenue, I say affiliate income. That’s how the old saying goes, right?

Before we ran the survey, we interviewed customers and learned the language they use to talk about their revenue streams. We spoke to 5 customers remotely, and reviewed emails from several others.

This allowed us to make an educated first guess at writing the survey in our customers’ words. Both the wording of our questions and the answer options were informed by this qualitative data.

If this sounds like a lot of work, take note — you may already have the data you need! In our case, we had already run some interviews and were able to leverage those recordings to take a first stab at a draft of the survey.

Run a remote pilot study with users


I once had a professor tell my class that we would be completely insane to run a survey without piloting it. And while I cannot say for certain that I am not at least a bit insane, I am most definitely not completely insane, so I always pilot my surveys, and you should too!

After creating a first draft based on our initial interviews, we ran a small pilot study through remote user interviews with 5 customers.

We asked participants to do the survey and think out loud, and then we reviewed their answers with them to see whether they understood the questions. We iterated on the survey copy after each one.

It’s important to stay quiet while the customer takes the survey and to keep the focus on testing it. I know it’s tempting to dig into their answers, but you are trying to learn whether the survey will give you good data, not about that customer’s specific workflow.

The main function of the pilot is to help you edit. Don’t forget to kill your darlings (and by darlings, I mean non-sequiturs, incomprehensible rating scales and anything that makes your customers’ faces become this emoji: 😰).


When we initially wrote the survey, for each question, we called out:

  • What are we trying to learn?
  • How will we use this data in our design work?
  • What assumptions and/or biases are inherent in this question?

With the pilot study, we were able to validate whether we could really learn what we wanted to from each question, based on how users interpreted it. This helped ensure we got the data we needed: we removed questions we couldn’t learn from and refined those we could.

(On that note, one piece of practical advice: avoid long-form text fields if you can. The data they produce is hard to review and learn from. In our pilot, we saw that at worst people type a space and move on, and often they write something you can’t really understand without talking to them. If you do need open-ended questions, be sure to test them in your pilot study to see whether users interpret them as intended.)


The survey is done — time for more research!

Our survey answered our original questions around how our customers make their money. But naturally, these answers opened the door to a host of new research questions.

This is because there’s a limit to what we can learn from a survey about the why — why do our customers need this feature in the context of their day-to-day workflow? (See, I haven’t forgotten you, survey skeptics!)

As design continues, the survey isn’t just a means to an end — it’s the beginning of a longer conversation with a rich pool of research participants. So don’t forget to ask if you can follow up! We tested prototypes with survey participants and took the opportunity to go over their survey answers with them to figure out what went unsaid.

Like many research methods, surveys work best when you pair them with qualitative conversations—both when you initially build them and when you analyze and use the data.

So, if you’re thinking of running a big old survey—fear not! Dipping into your UX toolkit and collaborating with customers can make surveys a powerful, essential part of your research process when you need to learn at scale.

S/o to Irfaan Manji, Product Manager, who I collaborated with on this survey.