7 mistakes in 1 survey question

Joseph Kay
Published in Designing Lyst
7 min read · Nov 9, 2017

I’ve always found it useful to see examples of how not to do something. You learn a lot from your own mistakes, so it’s even better if you can learn from everyone else’s mistakes too.

A company sent me an email recently containing a one-question survey that they had run.

I won’t name the company that sent it. I’m pretty sure they weren’t conducting serious research; it looks like this was just a bit of fun for them. Still, my apologies for picking on them like this, even if they are anonymous.

I’ve spotted seven things in their research design and their analysis that I think would be important mistakes in a real piece of research. I’m going to go through them one by one.

Here’s how they presented the question and the survey results in the email:

[Image: the survey question with a pie chart of the results]

Take a moment and see how many mistakes you can spot. I’m not talking about typos or anything like that.

Ready? Here’s what I found.

1. Wrong question format

Every survey question should have a purpose: each one should be designed to help meet a research aim.

This is the question they asked:

“Would you rather buy from a brand you know or buy the cheapest product?”

I don’t know for sure what the research aim was, but I think it’s fair to assume that it’s about uncovering the relative importance of brand familiarity and price.

So this is really two questions masquerading as one. It’s asking how important brand familiarity is, and it’s also asking how important price is.

Why is this a problem? By trying to do too much, the question can’t accurately record information about either subject, let alone both of them.

If someone indicates that they would rather buy from a brand they know, does that mean that brand familiarity is really important? Maybe. Or maybe it means that they prefer not to buy the cheapest brand. There’s no way to know.

Shoehorning it into this “would you rather” format obscures other possible data too. What if someone doesn’t care about brand familiarity or price? They would have to just pick one option, and their choice would be indistinguishable from someone who cared deeply.

It would make much more sense to have two separate questions that measure how much people care about each one. However, for reasons that I’ll come to, I don’t think a survey is the right approach at all.

2. Uneven comparison

This one is more subtle.

If they want to know relative importance of brand familiarity and price, the wording of the question should present the two options fairly and evenly. Otherwise, the results of the survey won’t represent the underlying issue well.

Why, then, are we comparing any known brand with the single cheapest product? That doesn’t seem like an even comparison to me.

It would make more sense to compare a product from the most familiar brand with the cheapest product, or perhaps any known-brand product with any cheap product.

3. High likelihood of bias

There’s a much bigger problem with the wording.

We are emotional creatures. We like to think of ourselves in a certain light.

When asked whether we prefer known brands or the cheapest product, there’s a good chance that we would feel more comfortable selecting the former, whether it’s true or not.

There’s a stigma associated with buying the cheapest option. Even in an anonymous survey, we can’t always escape the effect of that stigma.

Is there a way of wording the question to avoid this problem? I’m not sure. I suspect that this subject really isn’t something that you can get accurate information about from a survey.

What would work better than a survey? If you’re worried about people misrepresenting themselves, it may be better to observe what they do rather than ask them about it.

4. High likelihood of misreporting in general

There’s another reason why a survey isn’t the right research method for this subject: people won’t know the answer to the question.

Do you know how you make decisions about brand and price? I personally would find it incredibly hard to tell someone which one has more influence on my decision-making.

People in general are very bad at explaining why they did something, or even what they did. How often do you go to the supermarket? How many times per week do you eat tomatoes?

Our memories are faulty and our powers of introspection are even worse. So when it comes to conducting research into these sorts of issues, it’s far better to observe what people do than it is to ask them what they do.

In this specific case, you might be able to run an experiment to determine the effects of brand familiarity and price. For example, you could expose shoppers to different brands at different price levels and record their purchase behaviour.

5. Overly broad scope

“Would you rather buy from a brand you know or buy the cheapest product?”

What were you imagining buying when you first read that question?

A brand you know of what? The cheapest product in what category?

They asked the question in general terms, but people don’t make decisions in general: they make specific decisions in specific contexts.

For example, the roles of brand and price might be very different for the purchase of tinned beans compared with leather sofas or travel insurance.

Furthermore, people’s feelings about tinned beans might change when they’re hungry. Maybe they would be willing to spend more. Maybe they would be more easily enticed by a premium brand.

We don’t know how the results of this survey would apply to any specific case. Without that, the results are pretty much meaningless. Maybe everyone was imagining beans, maybe there was a wide variety. We can’t know.

Let’s go back to the experiment where we expose shoppers to different brands at different price levels. It would be important to study a specific product category. We would also want to keep a close eye on other variables, like income or hunger.

6. Misleading analysis

This one might sound picky, but it’s arguably the biggest problem of the lot.

The title on the chart is:

“Brand always wins”

The data they collected says that 69% would rather buy from a known brand. How is that evidence that “brand always wins”? Maybe brand usually wins, but 69% isn’t “always”. It isn’t even close.

When you’re communicating research findings, it’s important that people come away with the key things that they should know. It’s good to sum things up in a concise way, and they managed to make it very concise, but they got the message wrong!

This undermines the whole point of doing research. Communication is key.

7. Incomplete data visualisation

A lot of people abhor all pie charts, but this one isn’t so bad. It’s just comparing two percentages, and pie charts do that quite well.

However, this particular visualisation has a problem. They only have a sample of 100 people, and while this isn’t a terrible sample size, it isn’t so big that they can get away with having no error bars to represent the confidence interval.

If you aren’t familiar with confidence intervals, they indicate the range within which the true number would most likely fall if the research were conducted with all the relevant people in the world, rather than just 100 of them.

In this case, a 95% confidence interval would go from 59.9% to 78.1%.

The standard formula for this interval is p ± 1.96 × √(p(1 − p)/n), where p is the proportion who chose “buy from a brand I know” and n is the total number of respondents.
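For anyone who wants to check the arithmetic, here’s a quick sketch of that calculation in Python. It assumes the usual normal approximation for a proportion (the article doesn’t state which method was used, but the numbers match):

```python
import math

p = 0.69   # proportion who chose "buy from a brand I know"
n = 100    # total number of respondents
z = 1.96   # z-score for a 95% confidence interval

# Normal-approximation margin of error: z * sqrt(p(1 - p) / n)
margin = z * math.sqrt(p * (1 - p) / n)
low, high = p - margin, p + margin

print(f"95% CI: {low:.1%} to {high:.1%}")  # → 95% CI: 59.9% to 78.1%
```

Note that with small samples or proportions near 0% or 100%, a Wilson interval would be a safer choice than this simple approximation.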

That’s quite a big range and it’s worth showing in the visualisation. I’ve never seen error bars on a pie chart before, so perhaps it would be better to use a different kind of visualisation, like the stacked bar chart above (except with labels and things).

Bonus: Poor legibility

A small extra one for free!

Why did they write the labels in the same colours as the pie chart segments? That text written in peach is very difficult to read. This is an accessibility issue and also just an example of poor communication and design.

In the spirit of learning from our mistakes: what mistakes have I made? Have I missed anything? Is there anything I’ve got wrong? Please let me know in the comments.

Also, Lyst is looking for another user researcher to join the team. At Lyst we’re always learning, and we love good research design and analysis. If that sounds good to you, click on that link and send us an application.
