The impact of incentives on SMS/text message survey response rates on an mHealth platform in Southern Africa

Charles Copley
Patient Engagement Lab
5 min read · Sep 29, 2019

AUTHORS: Charles Copley, Eli Grant


Surveys are often used to get feedback from the users of a service. However, the kinds of people who respond to surveys might not be the people whose feedback you really need! This is known as selection bias. Reducing this bias requires increasing the proportion of people who respond. One way to do so is to offer financial incentives. Another might be to vary the length of the invitation message: perhaps a shorter message is easier to process, making people more likely to respond.

To test these effects we put together an SMS experiment that invited mothers to participate in an online survey of a mobile health platform in South Africa. The experiment randomly assigned invitees to different forms of incentivization, varying both the mode (i.e. a guaranteed fixed amount vs a lottery) and the amount, which ranged from R0 to R50. We expected that incentives would increase response rates, but we also needed to assess how they might affect the accuracy of the survey responses. In addition to the incentive arms, we randomly allocated people to receive invitations of different lengths.
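As a rough illustration, the allocation might look something like the Python sketch below. The specific amount levels and the uniform allocation across arms are illustrative assumptions, not the study’s actual randomisation code.

```python
# Illustrative sketch only: the arm labels and amount levels below are
# assumptions, not the study's actual design.
import random

INCENTIVE_ARMS = [
    ("none", 0),                                        # no incentive
    ("lottery", 5), ("lottery", 20), ("lottery", 50),   # chance to win airtime
    ("fixed", 5), ("fixed", 20), ("fixed", 50),         # guaranteed airtime
]
INVITATION_LENGTHS = ["long", "short"]

def assign(invitee_id: str) -> dict:
    """Randomly allocate one invitee to an incentive arm and an invitation length."""
    mode, amount_rand = random.choice(INCENTIVE_ARMS)
    return {
        "invitee": invitee_id,
        "mode": mode,                # none, lottery, or guaranteed fixed amount
        "amount_rand": amount_rand,  # airtime amount in South African Rand
        "invitation": random.choice(INVITATION_LENGTHS),
    }
```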

To test how incentives affected the validity of a person’s responses, we embedded three questions with known answers into the survey.

The questions we chose were:

  1. In which province did you register? There are only 9 provinces in South Africa, so this is a very easy question to answer correctly, and the answer does not reveal anything personal about the respondent.
  2. What is your age? This is a more identifying question that users may not want to answer if they do not trust the service. It tests how likely a user is to answer identifying questions correctly.
  3. Who registered you on the service? On this health platform women are registered by mobile phone, either on their own phone (with assistance from a health care worker) or on the health care worker’s phone, sometimes at a later stage from a written register that mothers fill in while in the waiting room, in order to save time. This is a more difficult event to remember, and so tests the recall bias of survey responses.

The experiment was designed as shown in the diagram below:

Incentive Type

For this study we randomly allocated participants to receive different types of incentives. The differences are clear from the invitation messages given below:

  1. None

Is the service helpful to you? Please take a survey to help us improve the service. Your identity will stay private. Answer the survey questions by replying with the number that matches your choice (it’s FREE). If you have any questions about the survey, reply to this SMS. Want to start the survey? Reply ‘JOIN’.

  2. Lottery

Is the service helpful to you? Help us improve by taking a quick survey (it’s free). By participating you stand a 1/5 chance to WIN R50 airtime! Your identity will stay private. Reply ‘JOIN’ to start.

  3. Fixed

Is the service helpful to you? Help us improve by answering 8 quick questions. When you finish we’ll give you R50 airtime! Your identity will stay private. Reply ‘JOIN’ to start.

As can be seen below, we found that the Fixed invitation produced significantly higher response and completion rates. What is more, the Lottery invitation did not produce a higher response rate than the non-incentivized invitation.

Incentive Amounts

Here we randomly allocated people to receive differing airtime incentive amounts (between R5 and R50) under each of the incentive styles, as seen in the diagram below.

As shown in the diagram above, increasing the incentive amount did not increase the response rate for the lottery style! For the fixed incentive, on the other hand, response rates did increase with the amount, although the rate of increase is not linear over this range.
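One way to probe that dose-response pattern is a logistic regression of response on incentive amount, treating the amount as a categorical factor so that non-linear steps between levels can show up. The sketch below assumes a hypothetical responses.csv with one row per invitee and columns mode, amount_rand and responded; it is not the analysis code from the study.

```python
# Hypothetical analysis sketch: the input file and column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("responses.csv")     # one row per invitee (hypothetical file)
fixed = df[df["mode"] == "fixed"]     # fixed-incentive arms only

# C(amount_rand) treats each amount as its own level, so the fitted
# coefficients can step non-linearly from R5 up to R50.
model = smf.logit("responded ~ C(amount_rand)", data=fixed).fit()
print(model.summary())
```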

Invitation Length

A final test embedded into the experiment was whether the invitation length affected the response rate. We compared a single long invitation against a shorter invitation split across two messages:

  1. (LONG) Is the service helpful to you? Please take a survey to help us improve the service. Your identity will stay private. Answer the survey questions by replying with the number that matches your choice (it’s FREE). If you have any questions about the survey, reply to this SMS. Want to start the survey? Reply ‘YES’. (54 words)
  2. (SHORT, FIRST MESSAGE) Is the service helpful? Please help us improve by answering 8 questions for FREE. Your identity will stay private. Want to start the survey? Reply ‘YES’. To skip the survey reply ‘NO’. Replies are FREE. (33 words)
  3. (SHORT, SECOND MESSAGE) You will now receive the questions. Answer by replying with the number that matches your choice (it’s FREE). If you have any questions about the survey, reply to this SMS with your question. (33 words)

As can be seen above, there is weak evidence (p=0.054) that a shorter invitation is more likely to generate a survey response.
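For context, a comparison like this typically comes down to a two-sample proportion test on response counts in the long and short arms. The counts in the sketch below are made up purely to show the mechanics; only the reported p = 0.054 comes from the study.

```python
# Two-sample proportion test sketch; the counts are invented for illustration.
from statsmodels.stats.proportion import proportions_ztest

responded = [180, 212]   # hypothetical responders: [long invitation, short invitation]
invited = [2000, 2000]   # hypothetical number invited in each arm
z_stat, p_value = proportions_ztest(responded, invited)
print(f"z = {z_stat:.3f}, p = {p_value:.3f}")
```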

Effects on response accuracy

To test whether incentives had an effect on response accuracy, we embedded three questions into the survey whose answers we already knew for each participant. The overall accuracy rates for the different questions are given below:

We then investigated whether incentivization (of any kind) increased the rate of false responses. This is given below:

We found that incentivization only had a significant impact on the question “How were you registered?” (X² = 4.6105, df = 1, p < 0.05).
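A test of this shape can be reproduced with a chi-squared test on a 2×2 table of incentivized vs non-incentivized respondents against correct vs incorrect answers. The counts below are hypothetical; only the procedure mirrors the statistic quoted above.

```python
# Chi-squared test sketch on a 2x2 table; the counts are hypothetical.
from scipy.stats import chi2_contingency

#            correct  incorrect
table = [[310,  90],    # incentivized
         [280, 120]]    # not incentivized
chi2, p, dof, _expected = chi2_contingency(table, correction=False)
print(f"X² = {chi2:.4f}, df = {dof}, p = {p:.4f}")
```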

Conclusion

When we started this experiment, most people at our organisation believed that incentivizing people to respond to surveys would heavily skew the results. We found that while incentives do affect accuracy, the effect is not as strong as most people had assumed. We had also been using lottery-style rewards as the de facto method for increasing uptake on a limited budget; however, our findings showed that lotteries had no effect on uptake.

It is always important to question assumptions.
