De-risk innovation: suggested answers in questionnaires

Luc-Olivier
Published in Design Tips
3 min read · Nov 24, 2021

Tip #4: A person’s real answer is their own answer, not the one we have suggested.

Technical note on Tip #4

Perhaps the reason for suggesting answers in questionnaires is to simplify or automate the analysis of responses. It is true that open-ended questions, which do not suggest any answer, require a lot of work to analyze: the answers must be grouped by category, trends must be identified, misinterpretations must be avoided, and so on.
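To give an idea of what that grouping work can look like, here is a minimal sketch in Python, assuming a handful of hypothetical free-text answers and hand-picked keyword categories (both are illustrative assumptions, not data from a real study). It only automates the crudest part of the job; a person still has to read each answer to catch nuance and avoid misinterpretation.

```python
from collections import Counter

# Hypothetical free-text answers to an open-ended question (illustrative only).
answers = [
    "I could not find the export button",
    "Exporting my data took too long",
    "The colors are hard to read",
    "I want to share reports with my team",
]

# Hand-picked categories and keywords -- in practice the categories emerge
# from reading the answers themselves, not from a list prepared in advance.
categories = {
    "export": ["export", "download"],
    "readability": ["read", "color", "font"],
    "collaboration": ["share", "team"],
}

def categorize(answer):
    """Return every category whose keywords appear in the answer."""
    text = answer.lower()
    return [name for name, keywords in categories.items()
            if any(keyword in text for keyword in keywords)]

# Count category occurrences to surface trends across all answers.
trend = Counter(category for answer in answers for category in categorize(answer))
print(trend.most_common())  # e.g. [('export', 2), ('readability', 1), ('collaboration', 1)]
```

Even this toy version shows where the effort goes: the keyword lists and categories are judgment calls, and every answer that does not match cleanly has to be read and interpreted by a human.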

Originally, this practice did not come from the world of Design or UX Design (User Experience Design) but from the world of Sociology, Marketing, Politics…

When User-Centered Design and UX Design approaches became popular after 2010, they naturally borrowed tools from other professions in order to stay “connected” with people or users.

But when it comes to solving users’ problems and making them happy with the designed solutions, it is necessary to know exactly what they experience and think WITHOUT ANY INFLUENCE.

These techniques that propose predefined answers are certainly faster, but they introduce biases.

And we must add that they introduce the same biases in Sociology, Marketing, and Politics.

What happens with the proposed answers?

The problem with suggested answers is that we focus people’s attention on things that might not be important to them. Because they are generally willing to help and are not confident enough to say that our proposals don’t fit their thinking or experience, they will answer with our proposals even when these might not be anywhere near the top of their list.

Here we have a first bias.

To solve this problem we have learned that we must imagine all possible answers. But what we imagine, even what we think is the broadest possible list of answers, may be completely off the mark.

Here we have a second bias.

And what about the user experience, which is made of actions AND sensations/emotions? We can try to translate it with stars and “likes”, or with “very bad”, “bad”, “not so bad”, “pretty good”, “good”, “very good”.

But we simply do not give people the chance to express their feelings and emotions in their own words.

Here we have a third bias.

What about traffic statistics and tracking? These data are useful until we start reading too much into them. A 60%* hot spot on an application or a website does not reveal what people think.

Extrapolating from this hot spot would be a fourth bias.
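To make the figure concrete, here is a minimal sketch, assuming a hypothetical click log; it shows that the 60% is nothing more than a ratio of sessions, which is exactly why it cannot tell us what people were thinking.

```python
# Hypothetical tracking log: one entry per session, True when the tracked
# button was clicked during that session (illustrative data only).
sessions_clicked = [True, True, False, True, False, True, True, False, True, False]

# A "hot spot" is just a ratio: sessions with a click / total sessions.
hot_spot_rate = sum(sessions_clicked) / len(sessions_clicked)
print(f"Hot spot: {hot_spot_rate:.0%}")  # -> Hot spot: 60%

# The figure tells us *that* people clicked, not *why* they clicked,
# what they expected to find, or how they felt once they got there.
```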

So how do we proceed?

Be prepared for a less simple job, one that will require you to study each response, each reaction, and its context, but one that will be much more revealing of what your users have understood and imagined.

Use as many open-ended questions as you can, in face-to-face sessions where the interviewer can pick up on state of mind, thought, and precise phrasing, as well as on facial and behavioral indicators of feelings and emotions.

* A hot spot is, for example, a button that is clicked by a majority of people.


Luc-Olivier Lafeuille

LinkedIn: https://www.linkedin.com/in/lucolivier/

Drawing by Julie Lafeuille
LinkedIn: https://www.linkedin.com/in/julie-lafeuille/

All the tips can be found on the Medium publication “Design Tips” or, in French, “Conseils de Conception”.

All the technical notes on de-risking innovative projects can be found on the Medium publication “De-risk innovative projets” or, in French, “Dé-risquage des projets innovants”.

