What next for Citizen-Led Security Standards?

Lauren Coulman
Noisy Cricket

Four years ago, we started a journey to establish Citizen-Led Security Standards. It was 2020, and at Noisy Cricket we were a couple of years into exploring and establishing the Responsible Tech Collective. With a vision to bring home the humanity to tech, our cross-sector, in-community collective of individuals and organisations was exploring opportunities to ensure tech was more equitable, inclusive and sustainable for the people of Greater Manchester and beyond.

Citizen-Led Security Standards Foundations

With seed funding available through the Co-op Foundation and Luminate, collectively we were scoping out which tech ethics issues were pressing on society. In Greater Manchester, security was fast becoming a hot topic. GCHQ moved in, the North West Partnership for Security and Trust set up stall and the Digital Security Hub was established. Yet, while the importance of protecting data — the currency of the digital age — from bad actors gained traction, who was considering the impact of data security on the people providing the data, in communities and society at large?

Alongside projects focused on ethnic equality in tech and people-powered smart cities, a partnership formed around Citizen-Led Security Standards, with Greater Manchester Combined Authority (GMCA), The University of Manchester and Open Data Manchester. Spearheaded by Noisy Cricket and supported by Honey Badger Clan, initial desk-based research revealed that people’s data vulnerability and their trust in organisations most likely lay at the root of the issue. Then, we opened up to communities.

We surveyed almost 200 residents, with a focus on people from communities across Greater Manchester, and learned that data sharing and misuse within organisations was of far greater concern than the hacking threats focused on by academia, government and business. Further workshops, targeted at families supported by local government, housing associations and local charities, saw 15 people share their concerns and hopes for data-sharing practices.

Citizen-Led Security Standards Focus

Focused on the context of data sharing within local government, over the course of the 18-month project we also worked with GMCA and the ten local authorities across the region to explore the internal challenges and opportunities in building trust through data sharing. Despite public sector trust being at an all-time low (with 60% of people distrusting councils), the Digital Economy Act (DEA) 2017 allows the government to share people’s data if it’s in their best interest. The question is, who defines “best interest”?

So, we asked our community participants what was and wasn’t ok when it came to data sharing, and how councils were navigating it. We learned a lot, starting with the structural challenges at play. With GDPR regulation and DEA legislation enabling local government to share data without explicit permission from communities, compliance and efficiency were the key considerations. While awareness-raising is built into the principles surrounding both policies, the action taken is limited.

While it’s easy to make assumptions about government complacency or indifference, we found that limited resources and resilience — in the wake of decade-long budget cuts — were the key factors influencing organisational ability to move beyond compliance into practising data ethics. Undertaking community engagement was also hindered by the lack of skills and expertise to move beyond transactional consultation practices and truly understand the wants and needs of the people they serve.

Citizen-Led Security Standards Insights

Communities emphasised their desire to be clear on the choices and opportunities being made available to them as a result of data sharing, the lack of which often results in complacency. Yet, the need for convenience in light of having to repeatedly share stories across multiple support agencies — and having the government step in when in crisis and/or an individual’s capacity is limited — highlighted that government data sharing is very welcome in some circumstances.

So, what are the circumstances? One of the primary levers for assessing how much personal risk and reward might be at play when data is shared is the relationship between the individual and the organisation. The more human interaction involved in the relationship (with the NHS, for example), the more likely trust is to be at play; and the greater the clarity on the purpose of the data being shared, the better. Yet understanding the nature of the data being shared, and how personal, sensitive or behavioural it is, is also key.

There’s also the context in which the data is shared onward to consider. A person might trust the organisation the data was originally shared with, but if it’s passed on to the police or social services, especially digitally, then the potential risk or reward of sharing shifts significantly. Strength and dependence of need play a part too: most government services don’t come with an alternative option for provision, which shapes how likely an organisation is to help or hinder community trust.

Citizen-Led Security Standards Intention

Trust is important. The people any organisation exists to serve need to know that it is acting in their best interest. Without trust, communities won’t share their data, and if organisations don’t have the data with which to inform decisions, design services or deliver support, then people disengage. This creates a downward spiral, where social issues and challenges go unaddressed and become entrenched, and the cost (to government, in this instance) of reversing the tide increases exponentially.

Trust is everything in delivering effective and impactful services, and it starts with understanding what people really want and need. So, it became clear that in addition to managing shared data to mitigate risk and maximise reward, how people are engaged and communicated with matters too. At its most simple, people need to understand what is being shared, why, and with whom. Far more challenging is the necessity of human interaction to avoid misinformation and misunderstanding and ensure shared value.

So, we’re now resuming our journey to determine how. Partnering with GMCA, in alignment with its ethics- and trust-focused Information Governance Strategy and its mission to use data more responsibly, means first mapping data sharing around its pioneering Changing Futures programme, identifying where risk and reward arise, to inform where trust building is needed. Championed relentlessly by Stephen Girling (Change Program Manager), initiated with Phillipa Nazari and enabled by John Curtis-Laurence (current Assistant Director of Information and Data Governance and Data Protection Officer), work is underway.

Citizen-Led Security Standards Next Steps

In Phase 2 of the Citizen-Led Security Standards project, we’ll be working across GMCA, Manchester City Council, Rochdale Borough Council and VCSE organisations (including Shelter) to determine not just data flows, but the systemic challenges and opportunities which help and hinder the building of community trust. With the intention of better understanding the needs of people experiencing entrenched poverty, we’ll be supporting wider efforts to move people beyond the cycle of deprivation.

With funding pending for Phase 3, when we’ll take what we learn out to communities to determine how to assess risk and reward and how best to communicate to build trust, the long-term ambition is to shape a solution which enables local government to better serve its residents’ interests while navigating the barriers to doing so. With systems change at its core, the Citizen-Led Security Standards project will be years in the “finishing”, but we’ll keep you posted along the way.

