How Might We Progress Citizen-Led Security Standards?

Lauren Coulman
Responsible Tech Collective
9 min read · Nov 28, 2022

When you think of tech security, what immediately comes to mind?

Cyber, perhaps? Hacking, and the potential for external attacks? Data misuse by nefarious agents, spreading your data across the internet without your consent? Questions around how your personal, sensitive or behavioural information is being used, perhaps influencing your democratic participation or shaping what you buy?

All very human responses to the subject of security and how your data is shared. Yet external agents are not the only cause for consideration when it comes to the sharing of our data. What about how the organisations you willingly hand your data over to share it, and the potential uses that opens you up to?

Privacy regulation seeks to cover data sharing within an organisation or group, alongside the ICO’s work to uphold citizens’ information rights and encourage openness around data usage. Yet, exceptions — like the Digital Economy Act 2017 — provide a legal basis for the public sector in particular to share personal data where “clear public benefit” is indicated.

The Act aims to utilise data across government departments, organisations and institutions to improve public services: housing data could be used to inform healthcare provision, or police data to inform social care services. In principle, there's great potential here to offer more targeted, integrated and effective services, especially if ongoing privacy is ensured and there is clarity and consistency, as intended by the Act, in how data is shared.

Yet, who determines what is to the public’s benefit? How comfortable do people feel that their data is being shared, and in what circumstances is it appropriate to do so?

Photo by Privecstasy on Unsplash

For government departments, the benefits of data sharing are clear. More insights on the people, families and communities in need of support could not only inform better interventions but also enable strategic prevention too, allowing government to support those they’re here to serve in more impactful ways. Public purse savings are possible too.

In an era when government budgets and resources are stretched beyond capacity, efficiency and effectiveness are paramount, but questions remain.

What needs does the public want met, and what does good look like to those experiencing challenges? What data is being shared by government? How is the risk of derivative data (information used without the wider context of people's lives being considered) being addressed, to ensure decisions made about people's lives are holistic?

Without the visibility of how data is being passed between council departments or between your GP and the police, the Digital Economy Act’s aim of providing consistency and clarity around data sharing is currently falling short, with consent and communication sorely needed to build the trust that government relies on to deliver much-needed outcomes.

The Digital Economy Act recommends codes of practice in its execution, including understanding community needs around data sharing and decision-making. Yet endemic national issues surrounding trust in government require more concerted efforts than consultation alone can afford to bridge the gap between the public sector and citizens.

The breach is most acute in relation to MPs and national government, with 76% and 73% of the population respectively believing each group is failing to make decisions that will benefit their lives. Yet the Carnegie Trust found that local authorities are also under scrutiny, with 60% of the public questioning how trustworthy their councils are in delivering good services.

While citizens often have little choice but to provide data to access public services, that lack of choice can affect how willing they are to engage effectively. The Digital Economy Act, in furthering the distance between decision-making and the daily reality of people's lives, has the potential to exacerbate the issues that influence people's trust in government.

So, how might it be leveraged to reduce the risk of data misuse and benefit both the public and the public sector equally?

Photo by Claudio Schwarz on Unsplash

At the Responsible Tech Collective (RTC), we brought together a group of cross-sector, interdisciplinary partners to explore how we might reduce citizens' data vulnerability by building trust between citizens and the government departments, organisations and institutions that serve them.

Spearheaded by systems change and design agencies, Noisy Cricket and Honeybadger, the project team comprised changemakers across Greater Manchester Combined Authority’s (GMCA) Information Governance team and the University of Manchester’s Centre for Digital Trust and Society.

At a time when government budgets and resources are under strain, the need to more effectively address region-wide poverty in Greater Manchester, where in some areas two in three children live in poverty, is urgent. Hence the Data Accelerator programme, where data sharing opportunities are being mapped to help the region's councils, the NHS and other government agencies expedite and enhance their efforts.

Combined with GMCA's pioneering, co-created strategy to centre ethics in its approach to information governance, and its mission to "foster trust between the people, communities, and businesses of Greater Manchester through greater transparency", this has framed a unique opportunity to explore what good data sharing looks like with (instead of for) communities.

To understand people’s parameters around data sharing, their needs and the challenges and opportunities inherent in building trust, Honeybadger surveyed over 200 people from diverse communities across Greater Manchester. Using real-life scenarios to indicate how people’s personal data might be captured, shared and used, the team also ran workshops with three groups of people from marginalised communities and vulnerable families.

Working with GMCA’s front-line teams, housing association One Manchester and Oldham-based charity Inspire Women, the team at Honeybadger explored what data sharing might look like in the context of community and government control of data. Here’s what we learned.

Photo by Kyle Glenn on Unsplash

Understanding Need and Offering Choice

Complacency with the public sector, driven by the need for convenience and access to products and entertainment, along with private sector inertia, skews perceptions of how data is shared versus the reality of how it's being used.

While 50% of those surveyed did understand and accept that data was used to inform services and drive targeting, whether embedded in T&Cs or leveraged through wider tech networks, they weren't happy about it, and their interest in a relationship with an organisation varied according to different community needs.

Our communities told us they want to have a say in how data is shared, though in more structured and transparent ways that help them assess whether sharing data will meet their needs.

Through engaging with community organisations that people have active connections with and confidence in, local government can reach more people and elicit a wider, more representative and inclusive understanding of how different groups' needs can be met, in supportive, capacity-informed conditions too.

Clarifying Communications and Managing Expectations

Misuse, as opposed to security attacks, is of greater concern to communities, as people's expectations of how data is used often don't match reality, leaving them feeling misinformed or misunderstood.

The fear of misuse varies depending on the organisation, with 50% of people surveyed feeling uncomfortable sharing information with the NHS vs 61% with the police. People’s willingness and ability to change behaviour, however, is limited, and if a person’s capacity is reduced — due to mental health or personal crises — they would like government to step in.

So, when it comes to trusting data, communities told us they wanted flexible and complicit consent, communicated in ways that enable them to determine what’s ok and not ok when it comes to decisions made about their lives.

The communities we worked with were quite clear that they believe government is responsible for ensuring the security of their data, including preventing its misuse, so that decisions are made in their best interest (rather than assumed). Communications around consent need to be clear, easy to understand and accessible too, so that the potential for misuse (and further impact on vulnerable communities) is minimised.

Photo by charlesdeluvio on Unsplash

Clarifying Risk vs Reward and Enabling Decisions

Looping back to whether data sharing ensures that communities' needs are met and decisions are made in their best interest, the exchange of value is a key consideration: people (complacently) tend to give far more than they receive due to data (mis)use. This creates a power imbalance, and once data is handed over, it's often out of sight, out of mind.

Yet, as well as organisations and institutions, platforms influence how people feel about data sharing too. It's very different to provide data in a real-life relationship with a GP versus through remote engagement with the police. Feelings about shared data are also influenced by whether people believe it will be used to directly improve the services provided, or to profile them and/or drive profit (which is sector-dependent too).

Communities would like to fully understand the potential of their data, so they can be clear on the cost or potential benefit of what they’re providing.

Communities are happy for government to make decisions if it makes their lives easier (depending on the organisation and platform), but they also want to be empowered, beyond binary choices, with more nuanced information and options. The threat of services being denied if data isn't provided also needs to be avoided.

So, what’s needed to make it happen? In the short-term — a period we anticipate taking up to two years — it will involve working with GMCA and the University of Manchester to effectively map out what data is shared. The intention is then to use that understanding to inform what engagement tools, data-sharing parameters and communication methods are required to build community trust.

This information will then be used to shape a data visualisation tool, which will help both local authorities and communities make better sense of data sharing and the decisions made around it, enabling local government to inform community choice, manage expectations and support decision-making.
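To make that concrete, here's a minimal sketch, purely illustrative rather than the project's actual tooling, of how data-sharing flows between organisations could be recorded and summarised before feeding a visualisation. The field names, example organisations and purposes are all assumptions for the sake of the example.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class SharingFlow:
    # One record per data-sharing arrangement; all fields are illustrative.
    source: str          # organisation holding the data, e.g. a housing association
    recipient: str       # organisation receiving it, e.g. an NHS trust
    data_category: str   # e.g. "tenancy history"
    purpose: str         # the stated public benefit
    legal_basis: str     # e.g. "Digital Economy Act 2017"

# Hypothetical flows of the kind the Act enables.
flows = [
    SharingFlow("Housing association", "NHS trust", "tenancy history",
                "inform healthcare provision", "Digital Economy Act 2017"),
    SharingFlow("Police", "Local authority", "incident reports",
                "inform social care services", "Digital Economy Act 2017"),
]

# Group flows by recipient so a visualisation (or a resident) can see,
# at a glance, who receives which categories of data and why.
by_recipient = defaultdict(list)
for flow in flows:
    by_recipient[flow.recipient].append(flow)

for recipient, received in by_recipient.items():
    print(recipient)
    for f in received:
        print(f"  <- {f.source}: {f.data_category} "
              f"({f.purpose}; basis: {f.legal_basis})")
```

The point of sketching it this way is that each flow carries its purpose and legal basis with it, which is the kind of context communities told us they need in order to judge whether sharing their data actually meets their needs.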

This is part of a strategic ambition to turn the visualisations into a tool that directly affects decision-making in the medium term and, in the long term, informs consent policy around data sharing and usage, starting with the public sector. The reality is that much groundwork is required to get us to the starting point, not least finding funding to underpin the skilled delivery partnership required to inform the approach.

Photo by Compare Fibre on Unsplash

Effectively driving systems change within local government, where moving towards a community-led approach requires both cultural and structural shifts within the organisations involved, means the runway for this project is decades. As with all systems change, what emerges in the fullness of time will be subject to the interdependent and dynamic needs and opportunities that arise along the way.

Yet, in the coming year, the project team are clear that we'll be delivering the following outputs:

  • A data map exploring the choices, risks, rewards and value in the context of community parameters for data sharing
  • Research on community needs, communication and decision-making in the context of data sharing
  • A data visualisation mapping the relationship, language and opportunities available to build trust between government and communities through data sharing

Inherent in the process of co-creating citizen-led security standards, we'll be actively working to build better relationships between government and the communities we partner with, foster improved understanding, and lay the groundwork for more trusted decision-making and delivery of services.

Now, to find the funding…
