What are we talking about when we talk about government data?

In 2010, the then CEO of Google, Eric Schmidt, famously claimed that from the dawn of civilization until 2003, humans created roughly 5 exabytes of information. That amount, he suggested, “is now created every 2 days, and the pace is increasing”. Today, IBM defines big data in a similar fashion: “Every day, we create 2.5 quintillion bytes of data — so much that 90% of the data in the world today has been created in the last two years alone.” These are numbers designed to inspire awe and wonder. They transform what is, historically speaking, a rather dull entity, data, into something approaching the sublime.

Such numbers are routinely challenged, of course, with one colourful commentator calling Schmidt’s figure “a load of crap”, but there is little doubt data production has significantly increased in recent years and data are increasingly placed at the heart of social transformation. Public administration is no exception. Indeed, with the global and national economy in bad shape, rising debt, the National Health Service in a state of emergency, unaffordable housing, melting polar ice caps, war, terror and resulting migration, it is increasingly difficult for governments to put a positive spin on anything. Data is one of the few exceptions.

In the recently published (UK) Government Transformation Strategy, for example, data is positioned at the centre of everything: “Data is driving fundamental changes in our daily lives and in the economy”, the report claims, and “Data is a critical resource for enabling more efficient, effective government and public services that respond to users’ needs. It is the foundation upon which everything else rests.” Readers are spared speculations on data’s sublime quantities, at least, but it’s clear that transformation in government will now be data-driven transformation. Data is somehow in the driver’s seat, “driving fundamental changes” and simultaneously the “foundation” for everything else — it is the road, or perhaps the vehicle?

Source: Government Transformation Strategy

One of the main ways government presents the transformative power of data is in terms of transparency. Back in 2011, for example, the Cameron government launched its Transparency Agenda in the aftermath of the ‘expenses scandal’. The agenda, largely overseen by Francis Maude, came to focus almost exclusively on opening up government data. The data.gov.uk website is a product of this agenda.

Source: Data.gov.uk

Similarly, Maude’s influential Open Data White Paper (2012) is bookended with discussions of “Building a transparent society”. Another example lies with the Digital Accountability and Transparency Act of 2014 in the US. This law requires the establishment of data standards on spending, as well as the routine publishing of standardised spending information. The act is commonly referred to as the DATA Act.

This tight coupling of data with transparency serves those interested in pursuing data-driven initiatives, but there is increasing recognition that the uses of data in government extend beyond those related to advancing transparency (leaving aside the surveillance and ‘signals intelligence’ work of security agencies). The recent Government Transformation Strategy states as one of its main objectives to “make better use of data — not just for transparency — but to enable transformation across government and the private sector” (7). According to the Strategy, this transformation will be achieved through opening up more government data; removing barriers to data use; appointing a Chief Data Officer and a Data Advisory Board; using data to improve decision-making and analysis; using data securely and appropriately; building infrastructure and new discovery tools; and transforming how data are stored and managed. With the exception, perhaps, of improved decision-making and analysis, it is difficult to see exactly how these changes will be transformative; that is, how data will actually be used to transform how government goes about its business. Appointing an officer, releasing more data, improving storage — these are fairly incremental developments that tell us little about how data will be used.

How are data used in government? There is no straightforward answer, but much can be learned from considering just a few places where data are used and become visible within the routines of public administration.

If data are to be made useful to anyone but a data scientist, for example, they need to be presented in a certain way. They also need to speak to well-established concerns that have obvious relevance to the user. The most common way these needs are satisfied is through a specific representational format, namely, the dashboard. Like the car dashboards on which they are based, information or data dashboards bring together a number of measures and indicators, typically on a single screen, for easy comparison and quick analysis. While any number of visual techniques are available, dashboards overwhelmingly rely on well-established visualisation techniques, such as line and bar charts, gauges, maps and pie charts. A typical dashboard contains a number of these visual elements on the same screen.
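As a rough illustration, the sketch below (in Python, using matplotlib, with all figures invented for the purpose) shows how a handful of these familiar chart types might be assembled into a single dashboard view. It is a minimal mock-up under those assumptions, not any government's actual dashboard.

```python
# A minimal dashboard sketch: several well-worn chart types on one screen.
# All figures below are invented for illustration only.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
transactions = [120, 135, 150, 148, 160, 172]                 # volume over time (thousands)
cost_per_transaction = [2.10, 2.05, 1.98, 2.00, 1.95, 1.90]   # unit cost (£)
channel_share = [68, 22, 10]                                  # share by channel (%)
channel_labels = ["Online", "Phone", "Post"]

fig, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(12, 3))

ax1.plot(months, transactions, marker="o")        # line chart
ax1.set_title("Transactions (thousands)")

ax2.bar(months, cost_per_transaction)             # bar chart
ax2.set_title("Cost per transaction (£)")

ax3.pie(channel_share, labels=channel_labels, autopct="%1.0f%%")  # pie chart
ax3.set_title("Channel share")

fig.suptitle("Hypothetical service dashboard")
fig.tight_layout()
plt.show()
```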

Data dashboards are used all across government. Departments use them to manage human resources, or to manage budgets. Many city councils have their own city dashboards. One local authority has a ‘missed bin’ collection dashboard; another has a ‘clean and green’ street cleanliness dashboard.

Source: Croydon Council Clean and Green Dashboard

Over the last few years, the Government Digital Service team within the Cabinet Office has been creating a dashboard for every public service offered by the government. Currently, there are over 800 government service dashboards. One way to understand how government is making use of data to enact transformation is to look more closely at these dashboards.

The earliest management dashboards can be traced back to the 1950s, when they were first used as part of company reports. These early dashboards were distinct in that they displayed more than merely financial data — commonly including human resource and operational data, for example — and in that their numbers were not equated with more official company figures, such as those found in annual reports or submitted to authorities. Dashboard data was more comprehensive and more up to date, but less authoritative. At this stage, the dashboards existed in paper form only.

The 1960s saw the emergence of Decision Support Systems, computer systems designed to help managers make business-related decisions. These systems, and the academic field of study that grew up alongside them, were deeply influenced by the work of the Nobel prize winner Herbert Simon. Much of Simon’s work revolved around rethinking decision-making. He criticised the idea that people were able to act in fully rational ways and instead stressed the ‘bounded’ nature of rationality. In his study of organisations, he pointed out that managers regularly make decisions with limited information, as it is not always possible or practical to gather all the facts necessary to make a rational choice. Instead, managers tend to work with heuristic methods, and the best one can hope for is a ‘good enough’ decision (which he described as ‘satisficing’). In an early book on management and decision-making, Simon also stressed that decisions are not arrived at spontaneously by managers and must instead be understood as processes, which can be broken down into distinct stages.

Source: DRS 2016

The field of decision support systems aimed to improve organisational decision-making capacities by using data and computation to augment and automate different stages of the decision process. Managers accessed this data through visual displays that would come to resemble the dashboards found in (paper) company reports.

Research on dashboards suggests they are used across all levels of an organisation, and that different dashboards can be tailored to different needs. So-called operational dashboards are deployed at the lower levels, while executives make use of ‘strategic dashboards’. In all situations, they are used to monitor performance, though what is monitored (a machine, a department, the organisation, an economy) and how varies. More recent dashboards often contain increased analytical capacities, so users can ‘drill down’ or generate new data queries.

For users, dashboards offer the promise of control. They position users ‘above’ a number of intersecting flows of data, data that has already been stripped back, filtered and formatted to reveal only the most relevant insights. Dashboard data does not produce ‘data scientists’ but rather ‘drivers’ — people who use data to do something else.

A focus on dashboards offers a different angle on the increased use of data in government, one where management decision-making around business performance is key. Let’s return to the Government Digital Service (GDS) — a team at the heart of government transformation. Largely through GOV.UK, the GDS team is attempting to recreate government as a platform.

UK Government as a Platform

The ‘platform’ is a one-stop shop for all materials published by the government and all of the services it offers to the public. By standardising and aggregating all services onto a single platform, the government can collect standardised data for every service. When a service is fully digital, or “digital by default”, a significant amount of data can be collected on each service. How many users? Completion rates? Which device? How long on each page? Which pages do people get stuck on? And so on. This is a practical example of increased data production in government.
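To make this concrete, here is a minimal sketch of the kind of per-session usage data a fully digital service might generate, and the simple aggregates that can be read off it. The field names and figures are hypothetical, not the actual GOV.UK analytics schema.

```python
# A sketch of the kind of usage data a fully digital service might generate.
# Field names and values are invented, not GOV.UK's actual schema.
from collections import Counter

events = [
    {"session": "a1", "device": "mobile",  "pages_viewed": 6, "completed": True},
    {"session": "b2", "device": "desktop", "pages_viewed": 9, "completed": False},
    {"session": "c3", "device": "desktop", "pages_viewed": 5, "completed": True},
    {"session": "d4", "device": "mobile",  "pages_viewed": 3, "completed": False},
]

users = len(events)
completion_rate = sum(e["completed"] for e in events) / users
devices = Counter(e["device"] for e in events)
avg_pages = sum(e["pages_viewed"] for e in events) / users

print(f"Users: {users}")
print(f"Completion rate: {completion_rate:.0%}")
print(f"Devices: {dict(devices)}")
print(f"Average pages per session: {avg_pages:.1f}")
```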

Part of bringing all government services onto the GOV.UK platform involves requiring departments to publish performance data about each service. This data is what comprises the 800-odd service dashboards mentioned earlier. The dashboards are all accessible via what was initially called the ‘performance platform’ and is now simply ‘performance’. According to the performance landing page, services are measured in terms of four key performance indicators (KPIs): cost per transaction, completion rate, user satisfaction and digital take-up. However, even a cursory look at the performance dashboards shows these KPIs are not being routinely used. Of the ten most used digital services, for example, only one (Vehicle Tax Renewals) presents data on all four KPIs. Some services display only a few indicators (transactions per year, total cost, cost per transaction and digital take-up are common), while others offer many more.

Government Digital Service KPIs
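For illustration, the sketch below shows one plausible way the four headline KPIs could be derived from raw counts. The formulas and figures are assumptions made for the sake of the example, not GDS’s published methodology.

```python
# Illustrative (not official) calculations of the four headline KPIs,
# using invented figures for a hypothetical service.
total_cost = 1_200_000          # £ per year, across all channels
digital_transactions = 400_000  # completed online
other_transactions = 100_000    # completed by phone or post
started_online = 450_000        # online transactions begun
satisfied_responses = 30_000    # positive survey responses
survey_responses = 36_000       # all survey responses

all_transactions = digital_transactions + other_transactions
cost_per_transaction = total_cost / all_transactions
completion_rate = digital_transactions / started_online
user_satisfaction = satisfied_responses / survey_responses
digital_take_up = digital_transactions / all_transactions

print(f"Cost per transaction: £{cost_per_transaction:.2f}")  # £2.40
print(f"Completion rate: {completion_rate:.0%}")             # 89%
print(f"User satisfaction: {user_satisfaction:.0%}")         # 83%
print(f"Digital take-up: {digital_take_up:.0%}")             # 80%
```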

Through looking at service dashboards, a clearer picture of data-driven transformation emerges. Whatever the perceived capacities of data, it is likely data will be worked into existing managerial functions; data will be aggregated and squashed into key performance indicators and formatted into easily digestible visualisation techniques, many of which are centuries old.

GDS Performance Dashboard for Stamp Duty Reserve Tax transactions

The implementation of data-driven change will be ad hoc and uneven, and the results will often underwhelm. The awe-inspiring exabyte flows of information described by Eric Schmidt will likely be navigated through the blinkers of performance management. But this doesn’t mean change isn’t coming. Our increased capacity to monitor, measure and analyse will further cultivate a culture of performance management. This emphasis on performance is being radically extended and intensified, producing new possibilities and even new modes of attention and awareness. Once performance data is available for all services, for example, these services can be compared, grouped or contrasted; they can be benchmarked or studied in aggregate for recurring patterns and anomalies.
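A small sketch of what such benchmarking might look like in practice: given a standardised completion rate for each service, unusually low or high performers can be flagged automatically. The service names, figures and threshold here are invented for illustration.

```python
# A sketch of cross-service benchmarking: flag services whose completion
# rate sits unusually far from the mean. Names and figures are invented.
from statistics import mean, stdev

completion_rates = {
    "Renew vehicle tax": 0.97,
    "Register to vote": 0.91,
    "Apply for a licence": 0.62,
    "Report a change": 0.88,
    "Book an appointment": 0.84,
}

values = list(completion_rates.values())
mu, sigma = mean(values), stdev(values)

for service, rate in completion_rates.items():
    z = (rate - mu) / sigma                      # distance from the mean, in standard deviations
    flag = "  <-- possible anomaly" if abs(z) > 1.5 else ""
    print(f"{service:22s} {rate:.0%}  z={z:+.2f}{flag}")
```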

Full screen view (right) on display at Sprint 15 conference. Source: gdsteam CC by 2.0

It’s worth ending on a final feature of these service dashboards. Each comes with a special “full screen mode”, in which the dashboard view is transformed into a sequential flow of single indicators. Each indicator hovers for a few seconds before being replaced by another number. Full screen mode is not meant for monitoring or analysis. It is for public display, in department entrances, offices or perhaps at GDS headquarters. These displays are designed to transform the work environment, to project an atmosphere of measurement, an ambience of performance. The scrolling of indicators gives these displays a rhythm and sense of motion befitting the real-time sensitivity and sensibility of our imagined data-driven future. When we talk about government data, we need to talk about these new techniques of performance management.