Data-sharing in government: why it’s time for a new social contract

The narrative around “data-sharing” in government needs resetting.

It is too often shorthand for the unaccountable duplication and joining together of data to meet short-term policy needs or reduce costs. Policymakers need to answer the following question: as more data is used to deliver more services, what will it do to people’s mental model of government, and will it be one that people trust?

Failure to answer that question, or to anticipate the unintended consequences arising from the opaque use of data, risks missing the true opportunity of digital government. It may also be a boon to those looking to actively decrease trust in government or suppress civic participation through disinformation campaigns.

Social Credit

To explain why, we first need to look at China and the development of the concept of “social credit”.

Contrary to many reports, China’s national-level Social Credit System does not assign people an individual score, or use advanced algorithms to decide people’s fates (at least not yet). It is better understood as a complex collection of legal agreements and simple lists joined together with people’s unique ID numbers. In their review of the system, Yu-Jie Chen, Ching-Fu Lin and Han-Wei Liu describe an ever-growing web of “memorandums of understanding” — data sharing agreements — between government agencies and a series of “red-lists” and “black-lists”.¹

The lists contain the types of data that many countries would hold about their citizens: things like court fines, timeliness of tax payments and breaches of business regulations. Reading Oxford academic Rogier Creemers’ English-language translation of the Chinese government’s announcements on Social Credit,² one of the things that comes through is that China is, in part, creating the basic building blocks of a justice, finance and business system because previously they did not exist.

This simplicity makes the system no less concerning. Social Credit is designed around the concept of “trust-breaking here, restrictions everywhere”,³ under which an infraction in one part of the system, like failure to pay a parking fine, pops up in another part of a person’s life, like applying for a job. The data-sharing agreements enforce this concept, and their breadth has recently been documented by the China Law Translate project.⁴

The result of these registers and data-sharing agreements is a changing, arbitrary and unknowable set of punishments. For a given action, it is impossible for someone to know what the outcomes will be and in what aspect of their life.
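The mechanics described above can be made concrete with a minimal sketch. This is purely illustrative: the agency names, ID formats and rules below are hypothetical, not drawn from the actual system, but they show how even simple per-agency lists, once joined on a shared ID number, let a sanction in one domain surface in an unrelated one.

```python
# Hypothetical sketch of "trust-breaking here, restrictions everywhere":
# per-agency black-lists keyed by a shared national ID. All names, IDs and
# rules are illustrative assumptions, not details of the real system.

# Each agency maintains its own simple list of ID numbers.
court_blacklist = {"ID-1001"}   # e.g. an unpaid court fine
tax_blacklist = {"ID-2002"}     # e.g. repeated late tax payments

# A data-sharing agreement acts, in effect, as a union over those lists.
shared_blacklist = court_blacklist | tax_blacklist

def check_job_application(national_id: str) -> str:
    """An unrelated service consults the shared list before proceeding."""
    if national_id in shared_blacklist:
        return "restricted"  # an infraction elsewhere blocks this service
    return "ok"

print(check_job_application("ID-1001"))  # restricted: the court fine follows the ID
print(check_job_application("ID-3003"))  # ok: no record on any joined list
```

The point of the sketch is that no advanced algorithm is needed: a set union and a membership check are enough to make the consequences of one infraction unpredictable across someone’s whole life.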

Fear of data-sharing: examples from the USA, UK and India

From the point of view of a democratic country, it is easy to dismiss this as “something that couldn’t happen here”, but could a system take on some of these characteristics through the accumulation of small political choices to join datasets together? As more data is collected about more people for more purposes, and as a government is perceived to be more competent in its use of that data, could any substantially complex and opaque system risk people changing their behavior due to perceived risk, even where none exists? What effect might the backdrop of questionable data practices in the private sector and a general mistrust of government have?

A few recent examples suggest these are questions worth considering.

In March 2018, City Lab reported on findings from the pre-test phase of the 2020 US census, where researchers had found “an unprecedented groundswell in confidentiality and data sharing concerns, particularly among immigrants or those who live with immigrants.” They also found that respondents were providing false names and dates of birth in response to those fears.⁵ This was before the US Department of Justice confirmed that the 2020 census would include a question on citizenship. It illustrates the risks arising from fear that data collected for one purpose will be used for other purposes.

On the other side of the Atlantic, in November 2018 the UK National Health Service finally pulled out of a controversial “memorandum of understanding” with the Home Office, which had granted Home Office immigration officers access to data about patients to help them trace people breaking immigration rules.⁶ The withdrawal followed objections from Public Health England and the House of Commons Health and Social Care Committee that people avoiding medical attention, or deliberately missing vaccinations, for fear of data about them being shared posed a risk to public health.⁷ ⁸ This, in practice, represented the UK government prioritizing short-term immigration targets over the risk of disease outbreaks, and had happened with limited legislative oversight.

In India, the Aadhaar identity number was created to make welfare distribution run more smoothly. A mix of pure utility and government policy is seeing it spread into more and more aspects of Indian public life, including property registration, cycle hire, digital wallets, tax returns and facial-recognition systems. That spread has prompted a vocal civil-society response highlighting the risks to privacy.⁹ ¹⁰ ¹¹ ¹² Some Indians have reverted to paying their taxes via paper forms (which do not require an Aadhaar number) rather than use the online service (which does).¹³ Even if the numbers are small, it illustrates a problem digital-transformation programs face if people don’t feel they can trust how a service will use data about them.

It is part of a healthy democracy that people can trust the services they rely on, and that they have the agency to fix things when they go wrong. These examples start to show some of the risks that democracies are facing in trying to live up to that ideal.

Towards a new social contract on data

So, what might need to be in place to build and maintain that public trust?

Instead of encouraging more “data-sharing”, the focus should be the cultivation of “data infrastructure”,¹⁴ maintained for the public good by institutions with clear responsibilities and lines of accountability.

Between institutions, there needs to be a clear understanding of the types of data that carry significant enough risks to require permission from elected representatives before they can be combined. This, in turn, requires a minimum level of data literacy amongst politicians and shows the importance of initiatives like TechCongress.¹⁵

Countries also need to consider how their identity ecosystem helps people keep certain aspects of their lives separate from each other. This probably means something with more plurality than a single, centralized system like Aadhaar being used for everything (something the recent court ruling may have secured for the Indian people).¹⁶

Policies also need to be designed to be less data-hungry in the first place. If data isn’t collected, then it can’t leak and can’t be misused. In time, this could also become an argument for less means-tested and more universal service provision, because means-testing comes at a cost to a user’s time and privacy.¹⁷ This will mean going beyond “data minimization” as currently understood¹⁸ and thinking about different ways of approaching a policy goal altogether.

Policymakers also need half an eye on their impact on people’s overall sense of fairness. As Donald Moynihan, Pamela Herd and Hope Harvey note in their paper on “administrative burden”:

“Individuals care as much or more about the process of their interactions with the state as they do about the outcome. This implies that procedures perceived as consistent, fair, and equitable are fundamentally important to citizens.”¹⁹

Governments will also need to develop ways to involve the public and their representatives in decisions about data. Being clear and open about what data is collected and what purpose it is used for will be critical for public trust. For that, we can look to existing approaches to public engagement, for example how people have a say in their built environment. But we should also seek novel ways to deliver transparency and recourse at the point of use of digital services,²⁰ ²¹ ²² and look to new legal-technical constructs like data trusts.²³

Finally, there needs to be more space for technologists to work with government to operationalize limits on how data can be used, in ways that can be trusted and verified by the public and their representatives. Given there are few examples to point to from the private sector, this may be one area where government needs to lead the way.

Start to get some of these things in place, while also keeping a ruthless focus on raising the quality of the design of digital services, and we could have the foundations of a social contract between people and their governments that is fit for the digital age.

Thanks to David Eaves and Ben McGuire for feedback on this blog post which was edited by Eva Weber.

  1. Chen, Yu-Jie, Ching-Fu Lin and Han-Wei Liu, “‘Rule of Trust’: The Power and Perils of China’s Social Credit Megaproject”, Columbia Journal of Asian Law, Vol. 32, No. 1, 30th April 2018.
  2. Rogier Creemers, “Planning Outline for the Construction of a Social Credit System (2014–2020)”, China Copyright and Media, 14th June 2014.
  3. Chen, Yu-Jie, Ching-Fu Lin and Han-Wei Liu, “‘Rule of Trust’: The Power and Perils of China’s Social Credit Megaproject”, Columbia Journal of Asian Law, Vol. 32, No. 1, 30th April 2018.
  4. “Social Credit MOU breakdown (BETA)”, China Law Translate.
  5. Kriston Capps, “Census Report Found ‘Unprecedented’ Fears About Privacy Last Year”, CityLab, 29th March 2018.
  6. Jasmin Gray, “NHS Pulls Out Of Data-Sharing Deal With Home Office Immigration Enforcers”, Huffington Post, 12th November 2018.
  7. Denis Campbell, “NHS will no longer have to share immigrants’ data with Home Office”, The Guardian, 9th May 2018.
  8. Alan Travis, “NHS chiefs urged to stop giving patient data to immigration officials”, The Guardian, 31st January 2018.
  9. Nisha Nambair, “Got Aadhaar? No need to scout for witnesses to register property”, The Times of India, 8th February 2019.
  10. Express News Service, “Aadhaar card for Mo Cycle”, The New Indian Express, 28th January 2019.
  11. Prabhjote Gill, “This is why Amazon wants your Aadhaar number”, Business Insider India, 13th January 2019.
  12. “Digi Yatra – A New Digital Experience for Air Travellers”, National Portal of India.
  13. Namita Devidayal & Lubna Kably, “Aadhaar rebels find ways to avoid PAN linkage”, The Times of India, 23rd July 2017.
  14. “Principles for strengthening our data infrastructure”, Open Data Institute blog, 31st August 2016.
  15. “TechCongress: A Congressional Innovation Fellowship”.
  16. Vidhi Doshi, “India’s top court upholds world’s largest biometric ID program, within limits”, The Washington Post, 26th September 2018.
  17. Richard Pope, “10 rules for distributed / networked / platformed government”, Writing by Richard Pope, 12th November 2015.
  19. Moynihan, D., Herd, P., & Harvey, H., “Administrative burden: Learning, psychological, and compliance costs in citizen-state interactions”, Journal of Public Administration Research and Theory, 25(1), 27th February 2014.
  20. Richard Pope, “Democracy at the point of use?”, Writing by Richard Pope, 23rd January 2015.
  21. Richard Pope, “Permissions.Understood.”, Writing by Richard Pope, 27th February 2015.
  22. “Data Permissions Catalogue – IF: An evolving collection of design patterns for sharing data”.
  23. Jack Hardinges, “What is a data trust?”, Open Data Institute blog, 10th July 2018.