Our team shares what we have learned in India to help others initiating or expanding remote surveying methods
Collecting household information can be a very expensive and time-consuming activity. You need to sample units, build a questionnaire, hire a field team, train the team, manage data quality, and analyze the data — all of which can take months.
Now, imagine being able to mobilize a survey within days.
Data on Demand and remote data collection
The Data on Demand team at IDinsight is doing exactly that — making it easy to access high-quality, low-cost data, especially from hard-to-reach, low-income populations. Phone-based surveys are a huge opportunity to make data collection faster, more efficient, and more cost-effective. The Innovations unit of Data on Demand has been experimenting with a range of research questions to understand how best to carry out phone surveys. Since December of last year, we have been building systems and strategies to implement phone surveys at scale.
With the COVID-19 crisis preventing in-person data collection in many countries, phone surveys are more vital than ever for collecting actionable data for policymakers. During the crisis, we are testing whether we can conduct many of IDinsight’s surveys by phone, on topics from maternal health to financial inclusion. This week we also launched surveys to inform policymakers’ COVID-19 response, and we will iterate on this approach to use phone surveys at an even larger scale. We realize other research organizations are making this shift, and we hope our learning to date can be helpful to anyone who wants to collect quick, low-cost, and high-quality data for the governments and organizations that need it most.
Research, set-up, and context for mobile surveying
In-person surveying, even at the best of times, can be expensive and slow. We started our phone surveys research with the end goal of replacing in-person surveys, where possible, with a tool that can be used any time a team needs to collect data quickly and at low cost, or to reach populations that may not be physically present in households (e.g., migrant workers). We began our research with surveyors and respondents we have already worked with during in-person surveys, but plan to expand our scope.
Our work has primarily centered in India, where phone penetration is extremely high in both urban and rural areas. According to Omidyar Network, phone ownership has increased from 65 percent in 2017 to over 90 percent in 2019. Using our own estimates from surveys in 27 of India’s highest-need districts, we find that even in poorer areas of the country, mobile phone ownership is between 70–90 percent (varying by region).
Knowledge gaps in the quality and management of phone surveys in the Indian context still exist. Our team has identified the following set of questions to guide our research and may add further questions as they arise:
1. Best practices for phone survey execution
1.1 How do we collect phone numbers with high accuracy?
1.2 How does geography/context affect our ability to reach respondents over the phone?
1.3 How effective is an appointment system? What is the most efficient way to make appointments?
1.4 What is the best time of the day to call households?
1.5 How can we increase pick-up and response rates?
1.6 What is the optimal duration for a phone-based survey?
1.7 What is the most efficient call-back procedure?
1.8 Is an in-person enrollment round required for phone surveys?
1.9 What are the best incentives for households to complete a phone survey/enroll in a panel?
1.10 What are best practices in surveyor management (assignments, incentives, monitoring)?
1.11 What are best practices in remotely training surveyors for a phone survey?
1.12 Which systems and software are crucial for executing phone surveys seamlessly?
2. Phone survey feasibility
2.1 How can we reach different respondents within the household over the phone?
2.2 What are the types of questions/sectors that we can feasibly ask about over the phone? What cannot be asked?
3. Data quality of phone surveys
3.1 What is the bias incurred due to not surveying respondents who don’t have phones?
3.2 What are the similarities and differences in data collected between in-person and phone surveys?
3.3 How does non-response bias compare between phone and in-person surveys?
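To make the coverage question in 3.1 concrete, here is a minimal back-of-the-envelope sketch (hypothetical numbers, not IDinsight's method): if phone owners and non-owners differ on an outcome, a phone-only survey misses the non-owner share of the population entirely, and the size of that gap bounds the bias.

```python
# Hypothetical sketch: how much can phone-coverage bias move a survey estimate?
# A phone survey observes only phone owners, so its mean differs from the
# full-population mean by the non-owner share times the owner/non-owner gap.

def coverage_bias(ownership_rate: float, mean_owners: float, mean_non_owners: float) -> float:
    """Difference between the phone-only mean and the full-population mean."""
    population_mean = (ownership_rate * mean_owners
                       + (1 - ownership_rate) * mean_non_owners)
    return mean_owners - population_mean

# Illustrative inputs: 80% phone ownership (mid-range of the 70-90% estimate
# above); suppose 60% of owners vs. 40% of non-owners report some outcome.
bias = coverage_bias(0.80, 0.60, 0.40)
print(round(bias, 3))  # → 0.04, i.e. a 4-percentage-point overestimate
```

The same arithmetic shows why higher ownership rates shrink the worst-case bias: the error scales with the share of the population a phone survey can never reach.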
Our research is divided into two phases. In the first phase, we have been conducting experimental pilots to understand how to run and manage phone surveys. Once in-person surveys are viable again, we plan to test the quality of phone-survey data by comparing responses to those from in-person surveys.
We will share some of our experiments and our main findings in a series of blog posts over the next few months, and we hope this information will be helpful for organizations looking to implement their own phone surveys. We also hope to create a discussion, criticism, and feedback forum as we work to make our research as beneficial as possible.
If you’re experimenting with phone surveys or have feedback about our processes, please comment below or on social media, or reach out to our team at abhilash.biswas[@]idinsight.org or mitali.mathur[@]idinsight.org with your thoughts and suggestions.