Survey’s Up!

An agenda for your pre-survey safety meeting

Erika Hall
Mule Design Studio
8 min read · Feb 11, 2020


OK, so I haven’t convinced you to avoid surveys completely.

Well then, this is where I help you prepare your team to learn the most without getting lost at sea.

Why surveys are the most dangerous research method

To recap, I consider surveys an advanced technique because there are so many small nuances to their design that have huge implications for the results. Yet it’s so easy to throw one together using available online tools. And surveys offer no feedback mechanism to tell you whether you’ve done it well or badly. (“Well”, in this case, means “good enough for you to learn something that reflects reality and is sufficiently generalizable that you can base important decisions on it with reasonable confidence.”)

Because of various cognitive biases, if you get enough responses to your questions, your brain will encourage the belief that you have incontrovertible hard data. This can happen even if the only creatures who respond are secretly eager chihuahuas who select answers at random by walking on their owner’s laptop.

Stinky garbage data can look just as pretty in a chart or graph as totally reliable, representative data. This is not a good thing.

Creating a good survey and interpreting the results accurately is much harder than the marketing materials for survey tools would have you believe. In my opinion, they should offer the same caveats as direct-to-consumer pharma ads. (Contact a research specialist immediately if you experience confirmation bias lasting longer than four fiscal quarters.)

I’m giving you this warning, not because I have it out for the makers of survey tools, but because bad data leads to bad decisions leads to bad design. And I want there to be better decisions and better design in the world.

Go Team Better!

The Great Persistent Survey Gyre

When you’re focused on your own thing, it’s easy to forget that what you’ve dedicated your days to represents merely a tiny sliver of someone else’s experience. This is true even if what you do is quite popular. (Even Despacito is only 4 minutes long.) You don’t control any of those other interactions, but they create the expectations your audience has of you, for better or worse.

Embrace this reality.

Because of how easy it is to create and distribute surveys, everyone is doing it all the time. This adds up to a lot of demands on the attention of your audience. Don’t make a grab for another scrap unless you’re ready to use it well.

The first insight you need is a sense of just how many other surveys, and of what types, the people you want to reach see on the regular. As a rough proxy, you can use the number of surveys that find their way to you and to the other members of your team. (This is the only time I’m going to encourage you to generalize from your experience to user experience, so enjoy it!)

Your assignment: One week before designing your own survey, begin a diary of every single call-to-action to take a survey you encounter. This includes every email, website, app, phone call, drugstore receipt, and comment card that requests input or feedback. Log each one. Make a little scrapbook of the best and worst, like a mood board. A meh board!

Image: a Twitter survey about the associations of various brands of ginger ale. NONE. THE ANSWER IS NONE.

Even if you aren’t planning on creating a survey anytime soon, this is a good exercise. It helps you realize that 80% of anyone’s online experience is resetting login credentials and dodging nonsense feedback requests.

Have everyone participating in the survey design go through this exercise. Once you’ve established that baseline, get together and run through this handy checklist. If you can honestly say yes to every item, you’re ready to go.

#1 You know what you need to know

Before you can write survey questions, you need to identify and agree on your research question. Is it a big question or a little question? Is it a qualitative (you need a description), quantitative (you need a measurement), or mixed-methods (what and how much) sort of deal?

Make sure everyone is clear on your standard of confidence. What will indicate that you have answered your question sufficiently well to move forward?
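If your question is quantitative, one way to make “sufficiently well” concrete is to agree on an acceptable margin of error before anyone writes a question. Here is a minimal sketch using the standard textbook formula for a proportion at 95% confidence; the numbers are invented, and it assumes a random, representative sample, which the rest of this checklist is about earning.

```python
from math import sqrt

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate margin of error for an observed proportion p from n
    responses, at 95% confidence, assuming a random, representative sample."""
    return z * sqrt(p * (1 - p) / n)

# Hypothetical example: 40% of 250 respondents chose a given answer.
print(f"±{margin_of_error(0.40, 250):.1%}")  # about ±6 percentage points
```

If a six-point swing would change your decision, you haven’t met your standard of confidence yet; if it wouldn’t, you may already have enough.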

#2 You have determined a survey is the best way to answer that question

A survey is a survey. A survey should never be a fallback for when you can’t do the right type of research.
— Me, Just Enough Research, 2e

Any online survey is really a little app that encourages and allows the respondent to provide you with true information about themselves. You need to know a fair amount about your target population before designing an app that can do that.

Be honest. Are you running a survey because that is definitely the best way to learn what you need to learn? Or, is there some other pressure leading you to force-fit what you need to know into this particular format?

Set aside your choice of research method for a minute and talk about what the ideal answer to your research question—your overall objective—would look like. Which activity is best suited to yielding that answer? Reading existing studies? Conducting contextual inquiry? Running your competition through a usability test? Or, conducting a survey?

In the words of business consultant Ronald J. Baker, don’t stick a ruler in the oven to check the temperature.

#3 You know how you are going to use the results

Using a survey or questionnaire to generate some ideas for further research is very different from taking a census your organization will treat as completely representative and use as the basis for a major investment.

Be very careful about this. Once survey results exist, it’s very easy to treat them as a source of ultimate truth. The more you refer back to them, the truer they seem. Don’t rest weighty decisions on a flimsy foundation never designed to bear them.

#4 You have a clearly defined target population that you know how to reach

Unless you are simply sending out a menu to your meeting attendees to find out what they want for lunch, your survey is only going to reach a subset of the population you want to learn about. And an even smaller number of the people you manage to reach will respond. That subset is your sample.

The more your sample differs from your target population at large, the more sampling bias you are dealing with, and the less representative your results.

What is your plan for getting responses that truly represent the people you want to learn about? This means not only having enough people respond for it to be meaningful, but also a representative distribution. The stated purpose, placement, and appearance of your survey have a strong influence on the response it gets. And the factors that improve response rate tend to increase sampling bias: the people who have unusually strong feelings or a lot of free time will be more likely to participate.
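To make that concrete, here is a toy simulation in Python (every number is invented for illustration, not drawn from any real survey) of what happens when the unhappiest people are ten times more likely to respond than everyone else:

```python
import random

random.seed(1)

# Toy population of 100,000 customers with satisfaction scores from
# 1 (furious) to 5 (delighted). The distribution is made up.
population = [random.choice([1, 2, 3, 3, 3, 4, 4, 5]) for _ in range(100_000)]

def responds(score: int) -> bool:
    """Unhappy customers (scores 1 and 2) are ten times more likely to answer."""
    propensity = 0.30 if score <= 2 else 0.03
    return random.random() < propensity

sample = [score for score in population if responds(score)]

def mean(xs):
    return sum(xs) / len(xs)

print(f"True population mean: {mean(population):.2f}")
print(f"Survey sample mean:   {mean(sample):.2f} (n={len(sample)})")
```

The sample comes back close to ten thousand responses strong and would chart beautifully, yet its average says “everyone is unhappy” purely because of who chose to answer, not because of what the population actually thinks.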

#5 You know enough about the people you’re surveying to write meaningful questions

The great thing about doing live interviews is that you can rephrase your question if the participant doesn’t understand what you asked. It’s often also possible to tell when someone is just giving the answer that makes them look good. You can come back at the same topic another way or probe for more detail. You often learn things you didn’t even think to ask about. None of this is possible with a survey. It’s like sending a little canned interview into the world and hoping for the best. (Yes, you can design branching questions based on responses, but that is a whole other can of advanced survey design worms.)

Will the people you’re surveying be willing and able to provide truthful, useful answers? Most people know the number of pets living in their household. No one can predict their future behavior. And I doubt many people have the nuanced associations with various brands of ginger ale advertisers wish they did.

#6 You know how your research objective maps onto various types of survey questions

After all, we are interested in measuring the attitudes of the members of the group, not those of the experimenter.
— Rensis Likert, “A Technique for the Measurement of Attitudes” (1932)

Once you know your objective and your audience, you need to decide how many questions to ask, of what type, in which order. Even given the same basic wording and the same population, the type and order of your questions, and the context in which they are presented, affect your results.

Closed-ended questions with mutually exclusive options are less work for the respondent to answer, but this requires you to know what all the potential answers are—not what you wish they were.

And if you’re using a scale (1–10, etc.), not only does the wording need to be familiar and meaningful, but the type of scale you choose has to reflect your respondents’ mental models. Doctors may not typically think about stack-ranking the features in their drug-interaction apps, so asking them to do so out of context won’t give you real-world information about their priorities or decision-making.

I’m not going to go into detail about all of the various question types here. I wrote a book that does that. And if you want even more detail, the Pew Research Center posts really informative stuff about methodology.

#7 You have a test plan

Remember, an interactive survey is an app with the function of generating useful data that represents reality. So you need to test it to make sure it’s capable of doing that before you release it into the wild.

A great way to do this is to recruit some people who represent the audience you want to reach and have them take the survey while speaking their thoughts aloud. This is the same cognitive walkthrough technique common to usability testing. You want to make sure that your questions are clear, meaningful, succinct, and…whatever the opposite of leading is.

Yes, this is a bit of work

Sure, surveys often seem easier on the surface than other types of research, but it can take much more effort to get them right. The complexity lurks in the deep. Running “a quick little survey” is the “three-hour tour” of research: you might find yourself more lost as a result, especially when you consider the onslaught of surveys your customers face every day. The shortcuts pile up faster than the insights do.

The value of the data you gather is measured not in the quantity of responses, but in the quality of your decisions, so think of your future self and make sure you have what you need.

Erika Hall
Mule Design Studio

Co-founder of Mule Design. Author of Conversational Design and Just Enough Research, both from A Book Apart.