Sorry, But Nobody Has Time For That
Have you ever participated in an in-person or online survey for school to earn extra credit, for a business to provide feedback on customer satisfaction, or for friends to help them out with a project? If so, you have probably felt deceived halfway through, upon realizing there were more questions than you had expected. And if you're like most people living a busy life, you were probably thinking of the simplest strategy to get through all the questions as fast as possible.
What is this likely to lead to (besides a yawn or two)? Logically speaking, when respondents speed through a survey, the quality and reliability of the data suffer. But some of you might wonder whether people actually zip through questions, because you might know somebody who knows somebody who thoroughly completes every question he or she is presented with (talk about persistence, am I right?). Sorry to break it to you, but your friend's friend seems to be the exception rather than the rule.
A recent study examined a random sample of approximately 100,000 surveys to understand how survey length (measured by the number of questions) affects the time respondents spend completing it. The researchers found that respondents take more time per question on shorter surveys than on longer ones.
Before jumping to the conclusion that long surveys automatically yield less thorough answers, I found other factors that can have an influence, such as the respondent's relationship to the surveyor (I'm sure if my best friend asked me to complete a survey, I would do so rigorously) and the relevance of the survey's subject matter to the respondent (I would personally skim through a survey about flossing). However, the data shows that survey length is a key variable that tips the meticulousness scale downward. For surveys longer than 30 questions, the average time spent on each question was almost half the time spent on surveys with fewer than 30 questions.
Additionally, not only are people more likely to zip through a long survey, they are also more likely to ditch it altogether. The findings showed a decline in tolerance as survey length grew: abandonment rates rose from 5% to 20% when surveys took more than 7–8 minutes to complete.
Nonetheless, surveys are too useful a data-collection tool to be tossed aside, so how can surveyors lessen this length dilemma? After administering many surveys last year for a class project, I've come up with two easy tips for people who are struggling to determine whether their survey is too long:
- Make sure it is respondent-friendly. This means minimizing completion time by reducing the upfront effort respondents need to put in, which also reduces survey fatigue. For example, you can use a multiple-choice format with a small number of response options (if only this were the reality for every academic assessment, too).
- Respect respondents' time, since "time is money". Base the number of questions on the time respondents will likely be willing to donate. Students participating in an online survey for extra credit expect to answer questions for a longer period of time (I once took an online survey that lasted about 45 minutes, but completing it was mandatory to receive full credit). However, for customers, colleagues, or pedestrians who are participating voluntarily, 5–10 minutes may be all the time they have to give, so set the number of questions accordingly! (And as the data above illustrates, to be safe rather than sorry, try to stick with fewer than 30 questions.)