The Future of Education Surveys

In a system with limited feedback options, education needs better ways to learn what’s working.

Trevor Selby
6 min read · Oct 24, 2016

What you need to know / TL;DR

  • If the survey isn’t mobile-enabled, you’re aggravating educators
  • Surveys are always a tax on educators’ time. This tax can be lowered by good design and compensation.
  • Future education survey platforms will be mobile, beautiful, and able to take advantage of external data and advanced weighting algorithms to ask fewer people fewer questions. Companies outside education are already doing this today.
  • Costs will shift from driving response rates to direct incentives.

Today’s Challenges

If you want to change the status quo in education, you need to understand what’s happening in the classroom. Teachers are at the heart of the education system. It is essential to understand what they need and gather feedback about what’s working and what isn’t.

Nevertheless, listening to educators in a complex system is notoriously difficult. The traditional path many states, districts, philanthropies, nonprofits, and researchers pursue is to send out a survey. Administering a survey at scale takes a tremendous investment of time and resources across the survey lifecycle: building a representative sample, fielding valid questions, recruiting and communicating with districts, administering the survey and following up, then collecting, analyzing, visualizing, and sharing the data, and finally using the data to inform decision making. All of this requires many months and a significant dollar investment.

At the same time, educators dislike taking surveys. For one, they are asked to take numerous lengthy surveys, often with similar questions, from different stakeholders. More importantly, they report not feeling empowered by the survey, often never seeing the results or learning how the data were used. States and districts are getting better about using data for feedback purposes, and groups like the Data Quality Campaign provide resources for building systems that support effective data use at all levels of the education system. Nevertheless, survey participation rates are often low. Among seven states issuing the Teaching, Empowering, Leading, and Learning survey, the combined overall response rate was 35% (with an impressive 89% in Kentucky). That’s half a million teachers whose voices are not being heard.

It’s not just education that faces this issue. “The future of surveys as a reliable means to measure trends is in doubt,” writes the National Science Foundation in a special report. Telephone survey response rates have fallen by as much as 70%, driven by the growth of cellphones and a decline in people’s willingness to answer surveys. The surprise over ‘Brexit’ was yet another high-profile polling failure.

These factors will continue to affect data gathering through traditional survey methods, requiring ever longer survey administration windows and more intensive follow-up and communication. As a result, costs will rise and the timeliness of survey data will fall.

Changing Incentives

I recently completed a project inspired by this challenge: can the survey experience itself create incentives for participation? Even small increases in survey response rates can save thousands of dollars.

To test the idea that a high-quality survey would lead to higher response rates, we partnered with Periscopic and IMPAQ International to combine human-centered design with an educator survey, and tested it with a group of teachers in Philadelphia.

When building the prototype, one of the things we heard from teachers was that they often take a survey and then never see the data again. We hypothesized that letting teachers see their answers in comparison to everyone else’s, immediately at the end of the survey, would create another incentive to finish. It was a little like the “see how you compare” quizzes online, but with detailed feedback and context. To validate the test, we also built a “standard” version of the survey tool and designed a random-assignment study to A/B test the two versions.

[Figure: Sample survey results page for the pilot project]
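
To make the comparison concrete, here is a minimal sketch of how response rates from the two arms of a random-assignment study like this one can be compared with a two-proportion z-test. The arm sizes and completion counts below are hypothetical, not the pilot’s actual figures.

```python
# Hedged illustration: comparing response rates from the two survey arms
# with a two-proportion z-test. All counts below are hypothetical.
import math

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Return the z statistic and two-sided p-value for H0: rate_a == rate_b."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value via the standard normal CDF, Phi(x) = (1 + erf(x/sqrt(2))) / 2
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical arm sizes and completed surveys (not the pilot's real numbers):
z, p = two_proportion_ztest(success_a=62, n_a=180, success_b=58, n_b=175)
print(f"z = {z:.2f}, p = {p:.3f}")  # a large p-value means no detectable difference
```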

What We Learned

Teachers responded at equal rates to both surveys. In other words, trying to increase participation with a “data reward” had no effect. In fact, it may even have reduced enthusiasm. Our in-person focus group discussion revealed why: the comparison view was an extra screen, and an extra screen meant more time out of that teacher’s day.

The implications are clear. A survey is always a tax on educators’ time. Our focus group participants appreciated the design and responsiveness of a modern survey tool, but it wasn’t enough to overcome this tax. From their point of view, surveys are a compliance exercise; the tax can be lowered through incentives and design, but not eliminated.

Innovation Driving New Possibilities

Fundamentally, surveys are fielded to gather valuable information. Yet they impose a tax on respondents’ time. Until genuine value is exchanged, this tax will exceed an individual’s incentive to participate. Incentive in an educational context means more than just money: it can include group or individual prizes, pressure from peers and administrators, or a desire to help improve the system.

While the state of the art today is either SurveyMonkey, a research firm, or a new entrant like Panorama Education, mobile computing offers new ways to drive efficiency in survey data collection and to compensate users for their time. A full 40% of our survey respondents attempted the survey from a mobile device. In 2016, a mobile-compatible survey ought to be the default, not an option.

A mobile survey application also has the potential to be much smarter about fielding surveys. Teachers could set the times of day they are busy to avoid interruptions. A mobile application could more easily stratify survey questions across a group, lowering the number of questions each person is asked, as sketched below. Over time, a cohort of teachers answering shorter surveys would produce valuable covariate data that further reduces the number of questions needed. The algorithms and methods for collecting responses and combining them in the statistical analysis could be automated.
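
As an illustration of that stratification idea (sometimes called matrix sampling), the sketch below randomly assigns each teacher a small block from a shared question bank, so the cohort covers every item while no individual answers more than a few. All names and sizes here are made up.

```python
# Sketch of matrix sampling: each teacher answers a small block of questions,
# while the cohort as a whole covers the entire bank. Names are illustrative.
import random

def assign_questions(teacher_ids, question_bank, per_teacher=7, seed=42):
    """Assign each teacher `per_teacher` questions, cycling through a
    shuffled bank so coverage stays roughly balanced across the cohort."""
    rng = random.Random(seed)
    shuffled = list(question_bank)
    rng.shuffle(shuffled)
    assignments, i = {}, 0
    for tid in teacher_ids:
        assignments[tid] = [shuffled[(i + k) % len(shuffled)]
                            for k in range(per_teacher)]
        i += per_teacher
    return assignments

teachers = [f"t{n}" for n in range(1, 6)]  # five hypothetical teachers
bank = [f"Q{n}" for n in range(1, 21)]     # a 20-question bank
for tid, qs in assign_questions(teachers, bank).items():
    print(tid, qs)  # 7 questions each; 35 slots cycle over the 20 items
```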

In this way, costs will shift from an infrastructure built to drive response rates to direct participation incentives. Better still, those incentives could be credited instantly to the user’s mobile device. Imagine a teacher getting an alert on her phone at 3:45 in the afternoon: “Would you like to take a seven-question survey in exchange for a $2 Starbucks credit?” At, say, 5,000 teachers per wave, that is a $10,000 incentive budget. At that scale, short-cycle data collection starts to become a reality both logistically and financially.

Certainly there are challenges with this approach, but it’s not an entirely new one. Companies, particularly in political polling, are already pursuing these models.

Civis Analytics has developed a sophisticated database of voters in order to model and predict election and policy outcomes. For example, when working with Enroll America to target health care outreach, they called 10,000 people and asked just one question. The answers to that single question were then compared against the hundreds of data points in their database, allowing Civis to figure out which variables were likely predictors of health care coverage for every one of the 180 million U.S. adults under the age of 65. Once those variables were identified, modeled, and tested, Civis built ZIP-code-based maps for Enroll America to target.
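
The general pattern, stripped to its essentials, looks something like the sketch below: fit a model on the small surveyed sample joined to rich covariates, then score everyone else in the database. To be clear, the data, features, and model choice are fabricated stand-ins; Civis’s actual pipeline is proprietary and far more sophisticated.

```python
# Sketch of the pattern: fit a model on a small surveyed sample joined to
# rich covariates, then score everyone in the database. All data, feature
# counts, and the model choice here are invented stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Covariates for the 10,000 people who answered the one survey question
X_sample = rng.normal(size=(10_000, 5))
# Their answer: "Do you have health care coverage?" (synthetic labels)
y_sample = (X_sample @ np.array([0.8, -0.3, 0.5, 0.1, 0.0])
            + rng.normal(size=10_000)) > 0

model = LogisticRegression().fit(X_sample, y_sample)

# Score a full database (here, one million synthetic adults)
X_everyone = rng.normal(size=(1_000_000, 5))
coverage_prob = model.predict_proba(X_everyone)[:, 1]
print(coverage_prob[:5])  # estimated probability of coverage, person by person
```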

Qriously uses a network of mobile applications to deliver surveys. To get a representative sample, it applies a machine learning algorithm that predicts the demographics of a particular user, and it includes demographic questions to develop response weights, along the lines of the sketch below.
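
A minimal version of that weighting step, assuming demographic cells with population shares known from a benchmark source like the census, might look like the following. How Qriously actually weights responses is not public; this is just the textbook post-stratification idea.

```python
# Minimal post-stratification sketch: weight each respondent by the ratio of
# a cell's population share to its share of the sample, so the weighted
# sample matches known demographics. Cells and shares are illustrative.
from collections import Counter

population_share = {"18-34": 0.30, "35-54": 0.40, "55+": 0.30}  # assumed benchmarks
respondents = ["18-34"] * 50 + ["35-54"] * 30 + ["55+"] * 20    # a skewed mobile sample

sample_share = {cell: n / len(respondents)
                for cell, n in Counter(respondents).items()}
weights = {cell: population_share[cell] / sample_share[cell]
           for cell in population_share}
print(weights)  # the under-sampled 35-54 and 55+ groups get weights > 1
```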

Future educator surveys will likely take on components of these models:

  • Mobile first
  • Surveys delivered over existing teacher and educator networks with smart weighting algorithms
  • Longitudinal panels with micro-credits

With over three million public school teachers in the United States and limited ways of getting their feedback, I expect educator surveys will need to become much smarter in order to reduce cycle time, decrease cost, and engage users, and ultimately to do a better job of listening to educators.


Trevor Selby

Exploring intersections of innovation, data, machine learning, AI, #edtech, and education. Former @gatesed. Weak Rock Climber.