Exploring Data Ethics

Lauren Coulman
Responsible Tech Collective
Apr 21, 2022 · 7 min read

Data is to tech what money is to economics, and how we capture, use and share it as organisations is fundamental to how we operate. Yet in treating data as an ownable asset, as opposed to what it actually is: personal, sensitive and behavioural information that belongs to the people who provide it, we forget that data is here to help us serve people, rather than the technology or organisations that wield it.

Photo by Alexander Sinn on Unsplash

At the Responsible Tech Collective, a group of values-led tech organisations and community representatives in Greater Manchester is working to bring humanity back to tech. Data ethics is central to the practices being shaped.

We understand that the more organisations put people first and create better outcomes for the users, consumers and wider communities that our digital outputs target and impact, the more likely we are to create better outcomes for our organisations too. So, above and beyond data privacy and standard legislation, we’re exploring how everything from consent and security to open data and cooperative usage can create digital win-wins all around.

Conversations should be about services, people and outcomes, and the appropriate digital solutions come from that.

Darren Pegram @ GMCA

At our monthly Responsible Tech Review, we open up space to explore and expand our shared understanding of what’s possible when it comes to creating a more equitable, inclusive and sustainable tech industry. In late 2021, we brought together the most progressive and proactive thinkers and doers — working cross-sector and in communities — in the data ethics space to discuss how data can be used to build and sustain trust.

A person-centred approach was deemed key to overcoming the problems that emerge when digital products and services are built with assumptions and bias baked in through design, development and data infrastructure.

Photo by Claudio Schwarz on Unsplash

As the majority of the tech industry consists of white, heterosexual and able-bodied men, the needs of the rest of humanity can often go overlooked. A pertinent example: back in 2014, Tinder found that a security flaw allowed people to triangulate other users’ locations. Gender diversity on the team would have highlighted the potential dangers of making such data available for its female users and helped mitigate the risk of losing trust in the dating app.

Security isn’t this ethereal thing, so how do we bring security into what we’re doing? The smaller companies are the innovative ones, doing the cool stuff, which they need to do while protecting their customers.

Jahmel Harris @ Digital Interruption

Understanding what’s needed and important is key to building better features and functionality. So, the insights we capture during the design and development process are just as important as the data we glean to deliver and measure the performance of digital products and services. Building this into our organisations — knowing what questions to ask of people and our data — is therefore essential.

There are groups of people who aren’t generating data or whose data isn’t being picked up. So when analysis is done, their viewpoints are not included.

Reina Yaidoo @ Bassa Jamba

Beyond how we choose to gather, use and share data, who we capture data on has a massive bearing on the digital products and services we create too. Often as organisations, we work with incomplete data, gathering information on the people we deem valuable as users and consumers, without thinking of the impact our technology has on wider communities or society.

It’s a trick that was missed when online banking came to pass. In moving to a digital-first approach, elderly and rural communities found that local financial services, which were central not just to their ability to access money but to connecting with other people, were decimated. While banks across the U.K. have now shifted to a digital support model, it demonstrates how easy it can be to ghettoise already marginalised groups by excluding their data.

Photo by Samantha Borges on Unsplash

Ensuring everyone’s voice is represented not only enables organisations to mitigate the risk of losing trust. It can also help avoid costly oversights, such as redesigning and redeveloping products once issues emerge, and, more importantly, can help organisations innovate better. Imagine what smart cities could achieve if, instead of pushing tech innovation or saving local councils money, they simply focused on community needs.

To do so, however, requires involving people and communities in the design and development of relevant technology. This, instead of pushing products and services that aren’t needed, and deploying dark patterns or addictive features to keep people hooked. Data ethics therefore also requires clear communication about the risks and benefits of providing data, and most importantly, giving people agency over when, how and what it’s used for.

Trust is a lot to do with your understanding about what’s being collected and why. The cleverer the tech gets, the harder it is for the average user to know what they’re giving. If you’re filling in a paper form you know what you’ve given. But if you go onto a website or open an email, there could be things going on in the background that are not explained.

Tricia Wheeler @ Co-op

It’s about giving people choice, and doing that starts with organisations themselves, rather than making decisions and determining outcomes on people’s behalf. Data ethics requires an organisational culture that values people and works with honesty, openness and compassion for those it serves. Doing so helps drive performance and achieve success, but requires a shift towards people-first practices, processes and policies to realise it.

What was made clear, however, was that there is no global or homogeneous answer to doing data ethics “right”, or to building more trustworthy, risk-reducing and innovative products and services. Every organisation’s users, consumers and tangential communities are different, as much as the markets we operate in and the products we design, develop and collect data on to serve.

Photo by Dylan Gillis on Unsplash

The starting point is understanding what good looks like for the people you serve, and what that means in the context of your organisation. It’s about constantly learning, adapting your processes and policies and iterating your products and services. The silver bullet you’re looking for around data ethics does not exist, though giving your employees, teams and departments license to challenge and create is a good indicator of how likely you are to succeed.

Another approach is to start small, or local, and recognise the complex interplay between your direct approach to users and customers and your indirect impact on communities and society. That means understanding how your approach to design and development, plus data capture and usage, intersects with one part of a person’s life, and how that interconnects with the wider context they live in.

The people doing this right are the ones who are shifting this paradigm and going back to community-led ways of working.

Facebook is trying to standardise how a human talks. They talk about connecting people, but it’s just about engagement. For me the people who are doing this well are doing it locally, on smaller-scales, and are all about community.

Phil Hesketh @ Consent Kit

Accessing benefits, for example, isn’t just about the money someone receives, but about what housing they can afford and how the systems are set up to pay their landlords. You can’t create products and capture data in isolation, and landlords need to be considered as much as universal credit customers in how tech is created and data used. It’s by looking at the interplay between different needs and impacts that risks are mitigated and innovation made possible.

That’s hard, as the world now operates globally and legislation and regulation of the tech industry lag behind. Yet it’s the organisations getting ahead of how data is used, going above and beyond meeting need, and thinking about how they care for people through their data, that will succeed long-term.

Photo by Markus Spiske on Unsplash

Organisations thinking about wider benefit, beyond just their own bottom line, will see users and consumers stay long-term, when all it takes is the click of a button to switch loyalty. Facebook’s fears around TikTok’s popularity demonstrate this all too well. With conscious consumerism bleeding into the tech industry, how organisations signal their ethics will be of increasing importance too.

The danger is that if we do lose trust, then we lose that commonality to be able to work with each other, to interact. If we’re not trusting each other, then those everyday interactions break down. It’s almost that tech has to have those principles enshrined.

Julian Tait @ Open Data Manchester

Opening up conversations, communicating transparently and building meaningful relationships around privacy, sharing, security and consent will be central to building and sustaining trust in the tech industry. Thinking of people as being more than data subjects is just the start.


Social entrepreneur, body positive campaigner, noisy feminist, issues writer & digital obsessive. (She / Her)