Designing Approachable Surveys
Case study on designing accessible lifestyle surveys
Product Strategy | UX & UI Design | Prototyping & User Testing
Designing life events surveys
Surveys gave Life.io another way to gather personal health data through its health and wellness application. Users had been logging physical health metrics (fluids, workouts, diet and food, sleep, and illness), but there was a gap in mental wellness. The application lacked resources for holistic wellbeing, such as stress, work and career, mood, or family life. By launching a series of optional surveys, Life.io could quickly gather data and provide additional actionable insights to help users improve their wellbeing.
- Build trust so users will provide sensitive personal data, such as marital or employment information
- Gather meaningful and actionable data to build insights
- Provide a delightful survey experience through emotional design
As part of a small design team at Life.io, I was charged with the entire design lifecycle, including user research, feature planning, prototyping, and illustration. I follow a consistent process for each new feature, which I have outlined in this case study.
In this case study, I review:
- Motivating users to complete multiple surveys
- Creating emotional design in surveys
- User research on completion rates, UI patterns and survey fatigue
- Lessons learned about survey systems
Motivating users to complete multiple surveys
Drawing on the behavioural economics model of points and rewards, we let users earn points by completing surveys.
Each survey flow had five main sections, from the moment the survey was introduced to a surveys index screen that could be referenced later. These sections included a Dashboard card, an introduction screen, the survey question set, a completion screen, and the surveys index.
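The five-section flow described above can be sketched as a simple linear state model. This is a hypothetical TypeScript illustration of the stages, not Life.io's actual implementation; all names are invented for the example.

```typescript
// Hypothetical sketch of the five survey stages; names are illustrative.
type SurveyStage =
  | "dashboardCard"  // entry point surfaced on the Dashboard
  | "introduction"   // explains the survey's purpose
  | "questionSet"    // the survey questions themselves
  | "completion"     // success screen shown when finished
  | "surveysIndex";  // archive users can reference later

const surveyFlow: SurveyStage[] = [
  "dashboardCard",
  "introduction",
  "questionSet",
  "completion",
  "surveysIndex",
];

// Advance to the next stage, staying on the final stage at the end.
function nextStage(current: SurveyStage): SurveyStage {
  const i = surveyFlow.indexOf(current);
  return surveyFlow[Math.min(i + 1, surveyFlow.length - 1)];
}
```

Modeling the flow as an ordered list keeps navigation logic trivial and makes it easy to insert or remove a stage later.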
Why are we asking this question?
To ease concerns about sensitive information, each survey question included a disclaimer explaining how Life.io would use the data. Users were also given the option to skip any questions they felt uncomfortable answering.
Emotional design in survey experiences
How can we add a human touch to survey design? As discussed at length in Aarron Walter’s “Designing for Emotion”, designers have to think about how to add an element of delight to form design. When designing the survey patterns, I considered how Life.io’s forms could have a more personal touch.
Adding delight with smooth transitions, fades, and hover states
- Use smooth transitions between questions so that each data submission results in an appropriate feedback loop
- Include a progress indicator showing the percentage of the form completed, to reduce abandonment
- Validate acceptance with accessible form input types
- Use consistent and approachable illustrations for success moments, such as the completion page or surveys index page
- Keep the system consistent by creating a library of survey patterns (checkboxes, radio buttons, Likert scale, slider)
- Use hover states with action colors and background patterns to clearly indicate the selection and minimize user errors
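The progress indicator mentioned above reduces abandonment only if it always moves forward, so skipped questions should count toward completion just like answered ones. A minimal sketch of that calculation, assuming a simple answered/skipped/total model (hypothetical, not Life.io's code):

```typescript
// Hypothetical progress calculation: answered and skipped questions
// both count toward completion, so skipping never stalls the bar.
function progressPercent(
  answered: number,
  skipped: number,
  total: number
): number {
  if (total === 0) return 100; // an empty survey is trivially complete
  return Math.round(((answered + skipped) / total) * 100);
}
```

For example, a user who has answered three questions and skipped one in an eight-question survey sees the bar at 50%.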
Chunking related information
To increase form completion, we explored using the chunking method to organize similar question types within the same form. Grouping related questions helps users perceive the information more easily. The method seemed best suited to the Likert-scale question type, since users are asked to scan the question scale and then select their answer. By grouping questions into sets of two or three, users can scan and answer quickly.
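The chunking method above amounts to splitting a flat question list into small fixed-size groups. A hypothetical helper, for illustration only:

```typescript
// Hypothetical chunking helper: split a flat list of questions into
// groups of a given size (e.g. two or three Likert-scale questions),
// so each screen shows a small, scannable set.
function chunkQuestions<T>(questions: T[], size: number): T[][] {
  const groups: T[][] = [];
  for (let i = 0; i < questions.length; i += size) {
    groups.push(questions.slice(i, i + size));
  }
  return groups;
}
```

Splitting five questions into chunks of two yields three groups, with the last group holding the single remaining question.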
User research on completion rates, UI patterns and survey fatigue
Using a rapid usability testing method, I validated assumptions about completion rates and survey patterns for each question set. I wanted to understand whether participants were familiar with the survey patterns and could comfortably complete the surveys in a reasonable amount of time (~5–10 minutes) in one session.
Usability test: completion rate
User research suggested that the majority of participants (90%) could successfully complete the survey test tasks.
Usability test: button placement
Life.io had clearly defined guidelines for button states, but the button placement needed further validation. We hypothesized that placing the positive button to the right of the destructive button would increase its visibility: the “next” button sits to the right, leading the user to the next screen.
Which of these button placements do you find to be the easiest to select?
Chose right-aligned buttons
Usability test: survey fatigue
We wanted to explore the optimal length of the survey question lists. I assumed participants would prefer a shorter set of questions. However, when presented with the three options below, the majority of participants (65%) preferred the longest list because it offered an extensive set of options; participants didn’t want to be limited or forced to choose an option that didn’t accurately represent their background.
Which survey do you think you’re most likely to complete?
Chose longest survey
“These have the most options available. I hate it when I have to type in my own answer after clicking ‘other’”
- Research participant
Consistent survey patterns
I created new patterns for the surveys feature: a slider and a Likert scale. I documented these new components in the live style guide for future use. Since the product team had recently defined the brand style, I was also able to create a consistent illustration style guide documenting the product illustrations for this feature.
Lessons learned about survey systems
There were a few lessons learned during the development of the lifestyle surveys. While I validated assumptions throughout the design process, more could have been done prior to feature ideation to smooth the design lifecycle.
When asking users for sensitive personal information, it is essential to build trust and clearly confirm how the data will be used.
Clearly communicate to users how their actions are adding up to a greater goal. In this example, users could see progress validated on a progress bar, earn points for completion and receive a visual confirmation on a success page. Their results were then documented in a surveys index page which they could reference at any time.
Avoid information overload
We initially scoped the feature to include 7 surveys, but narrowed the focus to 3 main surveys for the MVP. By limiting the options for users, we could reduce decision fatigue.