Customer validation is a must-stop on every business’s path to a product. Whether you’re a startup or a corporation, testing your idea’s assumptions with the target audience you hope will buy it is absolutely crucial. Failing to do so adequately risks wasting enormous amounts of time and money.
Any kind of shortcut is a risk — and customer validation surveys are the most tempting of shortcuts. Surveys are easier and faster than in-person interviews. You push a button to send them out, hope some people will answer, check off the “customer feedback” box and go on your merry way.
But when you’re at the customer validation stage — the point at which you should be testing whether your idea is worth pursuing at all — it’s important to take the time and make the effort to interview people face-to-face.
Surveys have their place, but it’s not here. Why not? Here are a few of the reasons:
Surveys don’t allow for follow-up questions
You can try asking open-ended questions, but there’s no opportunity to follow up on provocative points or ideas a participant may bring up. The ability to dig deeper or to follow an unexpected path often brings out the best insights and actionable ideas.
People censor themselves
Folks tend to write what they think is an acceptable answer rather than the truest one. Or sometimes they don’t answer at all, fearing a written record of their opinions — which is what happened to an acquaintance of ours who tried to survey a group of HR professionals.
People tend to rush through surveys
Check the box and move on. Pick what seems like the right answer or answer briefly. No real thought required.
Nuance and emotion are undetectable
It’s tough to feel the emotion of a person’s response in a written survey. This is the most serious problem, because it’s in the nuance and emotion where the best ideas and solutions are found.
Surveys aren’t completely useless; they can be used at later stages to reconfirm certain types of hard data or to probe patterns that may have arisen in the face-to-face interviews.
Here’s an example: We had a client who was trying to come up with a product to help motivational speakers. He launched a survey and got some useful contacts, but we suggested he go back and conduct face-to-face interviews about curriculum, asking probing questions about the difficulties of running a motivational speaking business.
Over the interviews, about 8 or 9 difficulties came up. As a follow-up to those face-to-face discussions, our client sent a survey asking potential customers to pick which two of those 8–9 difficulties they found most pressing.
We didn’t use the survey to generate new learnings or findings; we used it to help us refine and prioritize what we’d heard.
So how many people do you need to speak with to get a valid sample? That’s hotly contested. But we think it’s good to talk to around 20 people in your core demographic.
With that many people you’ll start to hear the same answers, and patterns should emerge. If you’re not seeing patterns, you’re either talking to the wrong people, asking the wrong questions or missing some other nuance. Or maybe your whole hypothesis is off.
Sometimes surveys can actually mislead you
We worked with a client who had an idea for a service to help families after a loved one dies. This business would come in, go through the personal effects of the deceased, help organize them, decide what to donate and mediate any family squabbles that arose.
Because death is a sensitive subject, she decided to survey her target audience instead of talking face to face. Questions on the survey included things like: What was most difficult about going through the loved one’s belongings? Did people argue over who got what?
The answers to the survey were disappointing to the entrepreneur. People didn’t seem at all troubled by family squabbles. She began to think that there was no need for this business at all. But when she went back and talked to people face-to-face, she got a different story.
Once she’d developed a rapport with interview subjects and was able to press with follow-up questions, she started hearing that it was in fact difficult and stressful to deal with a deceased loved one’s things and the bad blood that often arises.
Clearly, people didn’t feel comfortable admitting to this in a written survey but were willing to discuss it once they’d established some trust in a one-on-one conversation.
So you never know what might skew your survey results.
There’s an art to face-to-face interviews, but here are some dos and don’ts.
- Do ask questions that evoke emotion. “Tell me about the last time you …” These kinds of questions often reveal frustrations or emotions or excitement, and that’s where the best ideas are found.
- Don’t interview friends and family. You’ll get bias and false confirmations because they’ll try to help you.
- Do ask people how they currently work around their biggest challenges.
- Don’t be too helpful. Sometimes when an interviewee stops to think, an interviewer will try to help by saying, “Do you mean …?” The moment you do this, you’re introducing bias.
- Do be OK with silence.
- Don’t forget to zoom out. Don’t go into interviews with such specific questions about your own assumptions that you fail to see or ask about bigger problems the person may be having.
- Do ask for real-world examples of the challenges they tell you about. Ask for more than one.
- Don’t hijack your own interview. You shouldn’t pitch your solution in that first interview. If someone naturally arrives at the solution you’re pursuing, that’s not your cue to say, “Actually, I was thinking of the same thing” and start brainstorming.
- Do ask for permission to come back and ask the person more follow-up questions at a later date.
- Don’t forget at the end to ask for a referral of other people you can interview.
The most important thing to remember is to get out of the building and start asking questions (the right way). Happy validating.