A fivefold increase in survey responses — case study

Helen Calderon
Published in Kainos Design · 4 min read · Aug 29, 2024
A cartoon hand completing a survey. Out of three faces, the user is selecting the happy face by putting a tick next to it.
Photo by Mohamed Hassan on pxhere.com

As a standard part of delivering digital services for the UK government, we define and measure success. One part of this is measuring user satisfaction.

On a recent project the survey capturing customer satisfaction wasn’t getting much traction. There was a push to add a second survey to the service to increase responses.

Reviewing the current survey

Instead of jumping headfirst into creating a new survey, I took a step back. I reviewed the current survey to understand what was working well and what might need to be improved.

Getting survey responses follows a typical conversion funnel.

  1. Respondents need to see the call-to-action (CTA).
  2. From there they must get to the survey.
  3. Lastly, we get completed responses.

I reviewed each step and the conversion rates between them: exposure to the call-to-action, the journey from the CTA to the survey, and lastly respondents’ interaction with the survey questions and the completion rate.

A funnel divided into three stages. The top stage says “See the survey call-to-action”, the middle stage says “Start the survey” and the bottom stage says “Complete the survey”.
Survey conversion funnel
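
To make the review concrete, here is a minimal sketch of the calculation between stages. The stage names come from the funnel above; the counts are invented purely for illustration.

```python
# Funnel stages with illustrative (made-up) response counts.
funnel = [
    ("Saw the survey call-to-action", 10_000),
    ("Started the survey", 150),
    ("Completed the survey", 120),
]

# Conversion rate between each pair of adjacent stages.
for (stage, count), (next_stage, next_count) in zip(funnel, funnel[1:]):
    print(f"{stage} -> {next_stage}: {next_count / count:.1%}")
```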

Alongside survey performance I checked how respondent privacy was being managed, and the survey’s accessibility.

My companion throughout this review was Caroline Jarrett’s book ‘Surveys that work’ — recommended reading for any UX professional working with surveys.

What was working well

The survey was hosted on Qualtrics, a powerful survey platform. This provided options for customising survey questions and logic, as well as automating analysis.

In terms of the conversion funnel, the survey was accessible from any page within the digital service via a site-wide banner. While the response rate was low, those who started the survey generally completed it. Lastly, the survey had been capturing customer satisfaction for several months, meaning we could track the metric, along with changes to the survey, over time.

Improvements we made

Building in privacy and accessibility

While this is primarily a business-to-business service, nearly one in four working-age people identify as having a disability (DWP), so accessibility still matters. Defra’s accessibility team provides a style sheet that addresses known accessibility issues within Qualtrics. This CSS can easily be added to, or adapted for, any Qualtrics survey.

Privacy needs to be built into any survey and planned for when managing responses. If you use Qualtrics, make sure to turn off the default IP address collection. Our survey didn’t ask for respondents’ personal data, but some respondents include it anyway, so a plan is in place to regularly go into Qualtrics and remove or obscure it within the raw data.
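
As a sketch of what that clean-up could look like offline, the snippet below redacts email addresses and UK-style phone numbers from a CSV export of responses. The file names and patterns are assumptions to adapt to your own export.

```python
import csv
import re

# Patterns for personal data that respondents sometimes volunteer.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\b(?:\+?44|0)\d[\d\s-]{8,}\d\b")

def redact(text: str) -> str:
    """Obscure email addresses and phone numbers in free-text answers."""
    return PHONE.sub("[phone removed]", EMAIL.sub("[email removed]", text))

# Hypothetical file names for the raw and cleaned exports.
with open("responses.csv", newline="") as src, \
     open("responses_clean.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        writer.writerow({k: redact(v) for k, v in row.items()})
```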

Improving survey completion

Next we turned to improving the questions.

  1. Reduce respondent effort and fatigue

While short, the survey included four open-text questions, which can fatigue respondents. Some of these were profiling questions.

We reviewed the answers to one of these profiling questions and identified two themes in the responses. To reduce respondent effort we updated the question, presenting a short list of options for users to pick from, plus an ‘Other’ option letting them write something in if the options didn’t describe them.
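
As a rough sketch of that step, tallying coded answers and keeping a write-in fallback might look like this. The theme codes below are hypothetical stand-ins for the two themes we found.

```python
from collections import Counter

# Hypothetical codes assigned while reviewing the open-text answers.
coded_answers = ["agent", "agent", "business owner", "agent", "business owner"]

# Rank themes by frequency, then keep a write-in fallback at the end.
options = [theme for theme, _ in Counter(coded_answers).most_common()]
options.append("Other (please describe)")
print(options)  # ['agent', 'business owner', 'Other (please describe)']
```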

  2. Leverage other data sources to answer questions

One question asked users how they found the service. We dropped this question as other tools and methods — in this case website analytics — could provide this answer.

Next, some of those who filled in the survey were employees involved in delivering the service. Instead of asking these respondents who they were, we used Qualtrics’ enhanced call-to-action tracking to work this out behind the scenes. For this project, if a user clicked the survey call-to-action from a page accessible only to government employees, Qualtrics would automatically profile their response and omit the profiling question.
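
A common way to implement this kind of tracking is to tag each survey link with a query-string parameter, which Qualtrics can record as an embedded data field. The sketch below shows the idea; the base URL and the ‘source’ field name are hypothetical.

```python
from urllib.parse import urlencode

# Hypothetical survey URL (the SV_... ID is a placeholder).
BASE_URL = "https://example.qualtrics.com/jfe/form/SV_XXXXXXXX"

def survey_link(source: str) -> str:
    """Build a survey URL tagged with the touchpoint it is placed on."""
    return f"{BASE_URL}?{urlencode({'source': source})}"

# A link on a staff-only page lets the survey identify employees and
# skip the profiling question; the banner link stays generic.
print(survey_link("staff-page"))
print(survey_link("site-banner"))
```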

Increasing survey starts

After streamlining the survey to make it easy to answer, we looked at how we might increase responses.

After clicking the link to the survey, users saw an interstitial page warning them that they were leaving the digital service for the Qualtrics survey. Extra steps increase friction and reduce the survey response rate, so we removed this page.

Next, we added several new touchpoints for our shortened survey. When service users contacted the service desk, the closing email now included a link to the survey. We also added calls-to-action at the end of the digital service.

Streamlining analysis

Instead of siloed data across multiple surveys, one enhanced survey captured data from multiple channels and users.

To speed up analysis and allow us to quickly report against user group or touchpoint, we used Qualtrics features to help segment and categorise responses on the fly.

Using the enhanced call-to-action tracking, we automatically grouped participant responses by touchpoint, whether that was the end of the service, the site-wide banner or the customer service desk.

We also introduced survey logic so that Qualtrics would categorise responses as they arrived. We created a custom variable, for example ‘task’; then, as responses came in, Qualtrics worked out which task a respondent had completed and assigned a value automatically. These two tracking options enabled us to quickly segment survey data and report results based on user behaviour and survey touchpoints.
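
As an illustration of the pay-off, segmenting an export by these variables then takes only a few lines in an analysis tool. The sketch below assumes a CSV export with hypothetical ‘source’, ‘task’ and ‘satisfaction’ columns populated by the tracking described above.

```python
import pandas as pd

# Hypothetical columns: 'source' (touchpoint), 'task' (set by survey
# logic) and 'satisfaction' (a numeric rating).
responses = pd.read_csv("responses.csv")

# Response counts and average satisfaction per touchpoint and task.
report = (
    responses.groupby(["source", "task"])["satisfaction"]
             .agg(["count", "mean"])
             .rename(columns={"count": "responses", "mean": "avg_satisfaction"})
)
print(report)
```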

It doesn’t end here

While we managed to get a fivefold increase in responses, we continue to look for improvements and iterate on the survey.

For example, as responses have come in we’ve noticed that one of our new multiple-choice questions was getting a lot of write-in ‘Other’ responses. We’ll repeat the cycle: review those write-ins and trial new options that better resonate with users.

Helen Calderon

User researcher @ Kainos. Some of my current interests include mixed methods research, accessibility and research operations.