Data & Policy Blog
Official statistics: Language for public discourse (Part 1)

This is the first of two blog posts by Walter Radermacher, former Director-General of Eurostat and Chief Statistician of the European Union. Here, Walter introduces the current social malaise around statistics and their historical role in underpinning public policy, and raises fundamental questions about the future of the discipline. Part 2 will highlight the importance of adequate policy frameworks for official statistics.

In the Corona pandemic, it has become clear how important reliable statistics are for political debates and decisions. Since its onset, the crisis has been accompanied by an ‘infodemic’: a flood of data of varying quality that confuses rather than informs the layperson. In the past, a crisis would have led to new information needs being directed towards official statistics as the preferred provider. This seems to have changed. On the one hand, reference is made to the opportunities presented by the data revolution, data science and learning algorithms (so-called AI) as alternatives to official statistics (which are perceived as too slow, too inflexible and too expensive). On the other hand, after decades of austerity policies, official statistics find themselves in a similarly defensive situation to the health sector: there is a lack of financial reserves, personnel, competences and know-how for much-needed innovations.

In January 2017, six months after the Brexit referendum and at the beginning of Donald Trump’s presidency, William Davies published a widely acclaimed article, “How statistics lost their power — and why we should fear what comes next”. In it, he expresses his concern that nothing less than the end of a statistical era has arrived, with serious consequences for public discourse, for trust in experts and in politics, and with openings for populist politicians to exploit the situation for their own purposes. With ubiquitous amounts of data and almost infinite possibilities of use, informational ecosystems are fundamentally changing; statistical logic is being replaced by data logic.

With the authority of statistics waning, and nothing stepping into the public sphere to replace it, people can live in whatever imagined community they feel most aligned to and willing to believe in. Where statistics can be used to correct faulty claims about the economy or society or population, in an age of data analytics there are few mechanisms to prevent people from giving way to their instinctive reactions or emotional prejudices. [1]

The term ‘statistics’ has the same linguistic roots as ‘state’. Since the Enlightenment, statistics has been closely married to the nation state, to democracies of various kinds and, unfortunately, also to dictatorships. In the neo-liberal governmentality that has prevailed since Margaret Thatcher came to power, and at the latest since the fall of the Berlin Wall, official statistics find themselves in the paradoxical position that the appetite for facts for evidence-based decision-making is steadily increasing, while at the same time they have fallen into disrepute with all state actors[2]. When official statistics appear in the front ranks of political priority, it is mainly when it comes to reducing supposedly useless bureaucracy or saving money in the public sector.

For some years now, resistance to evidence-based governance has been growing; scepticism towards experts of all kinds does not stop at scientists or statisticians. Coupled with a lack of statistical literacy and the impression of being at the mercy of the representatives of a supposed technocratic regime, a counter-position is forming in which the existence of neutral facts is negated or relativised. The reduction of social and economic questions to numerical aggregates and averages no longer seems acceptable, unless the results come from one’s own calculations and correspond to the ‘truths’ that demagogues deliver about what is going on in society.

After a year in a state of pandemic emergency, it is time to return to the discussion initiated by William Davies. Evidently, there are government services which had seemingly disappeared from our radar screens and whose importance is now apparent once again. These include — along with public health care — the provision of statistical information of sound quality, comprehensibility, and trustworthiness. It is necessary to ask the fundamental question, as we did after the fall of the Berlin Wall, whether we need official statistics as the backbone of democratic decision-making, and if so, what their tasks are and how they should be financed and anchored in the political system.

End of Part 1. Part 2 is available here.

[1] W. Davies, “How statistics lost their power — and why we should fear what comes next”, The Guardian, January 2017.

[2] See for example “ASA, COPAFS, Partners Urge Bolstering of Federal Statistical Agencies”

This is the blog for Data & Policy, the partner journal for the Data for Policy conference. You can also find us on Twitter. Here are instructions for submitting an article to the journal.




This is the blog for Data & Policy, an open access journal for the impact of data science on governance. Editors-in-Chief: Zeynep Engin (UCL, Data for Policy), Jon Crowcroft (Cambridge, Turing Institute), Stefaan Verhulst (GovLab, NYU). Published by CUP.
