Was Brexit a Polling Miss? Worse than That?

After the fateful Brexit tally was in, Carl Bialik of FiveThirtyEight sent me some excellent questions about the just-completed contest. With his permission, I’m publishing a lightly edited version of the exchange.

Q: How bad of a polling miss was this? How does it compare to, say, the Scotland referendum in 2014 or the 2015 general election for pollsters overall?

A: This is much more than a polling miss. 50–50 elections are polling nightmares, and this was a close contest from the outset. The betting markets were horribly wrong, as was most punditry. So the failure was more across the board, which makes it worse than previous errant forecasts. Polls were almost the least wrong, but that’s paltry solace.

Q: How did SurveyMonkey in particular do? What did your numbers show?

A: On the head-to-head question that’s in focus today, we showed a deadlocked contest from June 8–20, but movement toward Remain on Tuesday June 21 and Wednesday June 22. That change was clear. Either it was a real shift that was washed away by London thunderstorms or by young voters who failed to turn out, or it wasn’t a real shift in opinion at all, but an emergent problem with the sample of people interviewed. If it was a sampling issue, it was one that affected almost every UK pollster, including the Election Day surveys.

One hypothesis that we’re digging into is around the effect of the assassination of pro-Remain Labour MP Jo Cox. Conventional wisdom (and our Tuesday and Wednesday data) had it that the killing boosted Remain in the closing days. What if, by contrast, the tragedy left some Leave supporters less willing to share their opinions with others, including pollsters? We’ll have more on this in the coming days.

Beyond the horse race, our polling revealed important undercurrents: far more Britons saw the EU as benefiting the wealthy rather than the middle and working classes. Immigration was the dominant issue, and the prime motivator of the Leave campaign. The split between an unpopular Prime Minister and his own voters was also crucial. Our polls shone a light on these and other contextual trends — and we’ll be working to explain and share them in light of the tremendous interest in what happened here.

Q: What lessons can be learned for future UK polling?

A: One thing that’s crystal clear is that the answer isn’t as simple as pitting telephone polls against online ones, or probability samples against non-probability mechanisms. It’s too early to give a thorough answer — we are going to examine all the results and come up with lessons for the future.

Q: Any reason to think that this polling underestimate of support for Leave means US public support for Donald Trump — who is against immigration and trade — is understated by polls? Or to think that the two have nothing to do with each other?

A: US polls, including ours, did not underestimate Trump in the primaries, so at a high level there’s no easy or worrisome connection. However, part of the UK failure that might jump the pond was that pundits and their quantified expression in betting markets systematically underestimated populist anger because they didn’t want to believe it. We all need to be vigilant against letting any such bias creep into our analysis.