How to write survey questions that don’t suck

Have you ever given up on an online survey because it frustrated you with irrelevant questions?

Alexander Baxevanis
Webcredible, part of Inviqa

--

Here’s how to make sure your survey respondents don’t give up too.

While leading UX projects, I sometimes have to plan online surveys (or persuade people not to run one at all when I think it won’t produce any helpful results).

So when I come across a website or email that prompts me to take a survey, I’ll often click the link and attempt to go through it, to see whether I can learn from a good example or whether I’m going to start banging my head against a wall. Most often it’s the latter, and a few days ago was one of those days.

A charitable organisation I follow emailed me a link to a survey. As I felt pretty strongly about supporting that organisation’s cause, I decided to take a few minutes to answer. And then I came across the following question:

Which of the following devices do you most often use for Internet browsing?

- Computer tablet
- Desktop computer
- Laptop computer
- Smart phone

Looks like a pretty innocent question, doesn’t it? But in my mind, alarm bells started ringing, and I instantly knew I’d come across a typical example of a poorly written survey question. Of course, nobody sets out to write bad survey questions on purpose, so I was curious to find out what the survey’s creator was hoping to learn from that question. I got in touch with the organisation, and their response was:

“we just want to know the best way to format the communications we distribute to our supporters”

A worthy goal indeed, but not one that this survey question will help you with.

Don’t ask questions whose answers you can already find from other sources

Every question you add to a survey takes time to answer, and the longer your survey, the more likely people are to drop out before completing it.

If this organisation wanted to find out what sort of devices their current supporters use, they could simply have looked at their website analytics (a quick sketch of that lookup follows below). If they wanted to find out what devices the general population uses, much more comprehensive and reliable surveys have already been carried out and are published for free by Ofcom in the UK and the Pew Research Internet Project in the US.

Just a sample of the free statistics available through Ofcom in the UK
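
If you’re curious what the analytics route looks like in practice, here’s a minimal sketch, assuming a Google Analytics 4 property and the official google-analytics-data Python client; the property ID is a hypothetical placeholder, and your own analytics setup may differ:

```python
# Minimal sketch: break down active users by device category with the
# Google Analytics 4 Data API (google-analytics-data client library).
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange,
    Dimension,
    Metric,
    RunReportRequest,
)

client = BetaAnalyticsDataClient()  # authenticates via Application Default Credentials
request = RunReportRequest(
    property="properties/123456789",  # hypothetical GA4 property ID
    dimensions=[Dimension(name="deviceCategory")],  # desktop / mobile / tablet
    metrics=[Metric(name="activeUsers")],
    date_ranges=[DateRange(start_date="30daysAgo", end_date="today")],
)

for row in client.run_report(request).rows:
    print(row.dimension_values[0].value, row.metric_values[0].value)
```

A report like this answers the device question from data you already collect, without costing your supporters a single minute of survey time.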

A survey question asking about Internet access could only be truly useful if it were limited to a narrower context, for example:

If we were to send you a weekly newsletter by email, where are you most likely to read it?

Don’t ask questions when you’re not sure what to do with the answers

One trick I always use to evaluate survey questions is to pretend the survey is already done: I assign some random percentages to each answer and try to work out what decisions those numbers would let me make.

So let’s imagine that the following answer came back from this survey question:

Which of the following devices do you most often use for Internet browsing?

- Computer tablet: 20%
- Desktop computer: 15%
- Laptop computer: 40%
- Smart phone: 23%

How can you use this data to influence your design decisions? If only 15% of people use desktop computers, does it really make a difference in the way you format your communications? Or is it just an interesting statistic to collect?
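
To make this dry run less hand-wavy, you can generate several plausible result sets and rehearse the decision each time. Here’s a minimal sketch in Python (my own illustration, not a tool from the original survey) that assigns random percentages, summing to 100, to each answer option:

```python
import random

def mock_survey_results(options):
    """Assign random percentages (summing to 100) to each answer option."""
    weights = [random.random() for _ in options]
    total = sum(weights)
    percentages = [round(100 * w / total) for w in weights]
    # Rounding can leave the total a point or two off 100; absorb the
    # difference in the last option so the results always add up.
    percentages[-1] += 100 - sum(percentages)
    return dict(zip(options, percentages))

options = ["Computer tablet", "Desktop computer", "Laptop computer", "Smart phone"]
for _ in range(3):  # rehearse the decision against a few imagined outcomes
    print(mock_survey_results(options))
```

If none of the imagined outcomes would change a decision you’re planning to make, the question is probably just an interesting statistic, and a candidate for deletion.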

Make sure questions and answers make sense without further explanation

You may have noticed that one of the answers offered in our example question was “computer tablet”. You’ve probably guessed that the survey author meant tablets like an Apple iPad or Samsung Galaxy Tab. But guessing is not enough. The more people have to guess, the less reliable your answers will be.

Unlike face-to-face interviews, people completing an online survey don’t have the chance to ask you what exactly you meant. Neither can you ask them any follow-up questions.

If survey respondents encounter questions they don’t understand or don’t feel they can answer accurately, they’re more likely to get annoyed and drop out.

If you’re not 100% certain that your pre-defined answers cover all eventualities, allow people to select an “Other” option and perhaps give an answer in their own words.

Test before you publish

The best way to avoid these pitfalls is to try out a draft of your survey with a couple of potential respondents. It’s important that the people you test with had nothing to do with creating the survey, so they look at it from a fresh perspective. Only then will they be able to spot and challenge any assumptions you’ve made and give you useful feedback.

When you let somebody go through your draft survey, ask them the following:

“Was there any question that you didn’t understand or weren’t sure how to answer?”

“Did you have any doubts about why a specific question was being asked?”

“Did we ask for anything you wouldn’t like to disclose, or anything that could cause offence?”

Take any feedback seriously, and consider adding, removing or rewording questions to eliminate the issues you find.

A well-designed survey can give you meaningful insights.
A poorly designed survey will only give you false confidence.

Just because you have numbers next to an answer
doesn’t automatically make it more valid.

I work at Webcredible, a customer experience design agency. We use surveys amongst other tools to help companies connect to their customers and create people-centred, efficient and delightful digital experiences. Interested? Get in touch!

Header photo by Garrett Coakley (via Flickr/CC)
