To mend our society we need a new social contract based on collective responsibility, diversity in tech and listening to others’ experiences
— Speech to Digital Freedom Festival in Riga on 15 November 2019
I want to give you a positive vision of the future. All too often you’ll hear about how society is broken and it’s technology… well, social media… well, Twitter… that’s broken it. You’ll also hear despair about populism and the state of politics.
I want to make the case for why, in this world, we should still care. As well as democratising information and communication, the tech revolution has also democratised responsibility. We need to root our notions of responsibility and ethics in the tech industry in diversity, inclusion and listening to others’ experiences.
Social media and a vision for society
Social media has given us the alt-right in the US and the spread of violent far-right extremism, but it has also given voice to culture that might never otherwise have found its feet.
One of the Oscar favourites next year is likely to be a small South Korean film about social class called Parasite.
And why is it gaining a loyal fan base in the US and making millions at the box office? Well, it’s partly down to a meme called the Jessica Jingle.
Too often we are forced to decide between two visions of society and technology: either the libertarianism of west-coast California, where information should be set free, or the techno-authoritarianism of more restrictive states.
This polarisation does us no favours and leads to lazy thinking, which often ends in one of two political reactions: do nothing, or ban, block and restrict.
Stasis is not an option either. So we need to come up with radical but sensible policy that grasps the power of technology while mitigating its risks.
There are some tech challenges we need to face
1. Technology is disruptive. It forces us to rethink things we take for granted and adopt new approaches.
2. The pace of technological change is accelerating. And it’s often unpredictable.
3. Technology changes power structures. It can concentrate power and it can democratise it.
To respond to these challenges we need a new social contract, in which the shared responsibility of everyone in our society is recognised and a shared set of values is articulated.
This means responsibility whether you are a citizen on Twitter, a start-up building a new app, a multinational ‘big tech’ company or a government.
Policy, legislation and international agreements are necessary to tie these together and to create the right incentives.
What do I mean by responsibility and how can individuals make a difference?
Fake news, election interference and social media hate are fundamental challenges to modern democracies. But these complex problems can often feel overwhelming and insurmountable. We often think of ‘fixing’ things, as if we could flick a switch and solve all the ills of the internet overnight. We are better off thinking collectively: how can we all make things a little less bad?
The great thing about the world in which we live is the impact individuals can have.
Social media is the clearest example of the democratisation of public communication. But rather than seeing social media as an inherent evil, we need to reframe the problem: governments and citizens haven’t yet caught up with the technology.
We are all responsible for making things a little better. It’s Twitter’s and Facebook’s responsibility to address problems on their platforms, yes, but it’s also governments’ to set the framework for society, civil society’s to critique and give feedback, and users’ not to abuse the platforms.
Why are diversity and inclusion so important in technology and design?
Here are some practical examples of what taking more responsibility means in a business and media context. And this applies to everyone: if you are designing a product or service, you are designing from a political and social perspective, and you have a moral obligation to do it in a way that enhances society and doesn’t create new harms.
If there’s one message in this speech, it’s this: to take responsibility as digital citizens we need to listen to others’ experiences and then do something about what we’ve learned.
The Digital Freedom Festival’s commitment to an equal speaker line-up is a great example of doing something about it. Gender equality at conferences like this is a long way from a given, and it’s only by listening to a diverse range of voices that we can find new inspiration.
The Primavera music festival in Barcelona back in June is, for me, the best example of how you can make a difference by doing the little things.
The event hit the headlines by being the first big music festival to have a gender-equal line-up. Not only was it possible (many naysayers said it wasn’t), it was the most successful festival in their history, with tickets selling out for the first time as people flocked to see the Catalonia-native Rosalía sing on the Saturday night.
But that’s not the only thing they did. They embedded a principle of equality in every aspect of the festival.
All the bar staff were trained to deal with sexual assault and aggressive behaviour and signs were posted across the site about what to do to stay safe and look out for others.
The tagline for the festival was “nobody is normal”. The festival organisers listened to the experiences of women and of people of different sexualities and gender identities, and were explicit in their support.
They also made sure everyone else listened and were confronted with the idea of a shared responsibility for a safe and inclusive environment.
Diversity and inclusion are important for all of us and raise standards
Until August I worked at Sky as Head of Digital Policy.
I care about doing the right thing and about ethical, responsible behaviour. But I’m also a cis white man, so (1) it’s not for me to own that agenda and (2) it’s really important that I listen to others’ experiences.
I strongly believe that if you want to address some of the worst problems online, in media and technology you need diversity and inclusion.
For example, technology design is not neutral. Media and cultural production is not neutral. If something is made for a ‘neutral’ audience or ‘average’ customer, we need to question who that neutral person is: historically, they’ve been white, straight and male.
This is why diverse teams are so important: they improve the quality of products, services and content, and they look after the interests of people who’ve traditionally been excluded.
It’s up to all of us to listen to the experiences of others who might be excluded, and to include them and their views in whatever we’re doing.
We faced this at Sky when we were building a new parental-controls app, Broadband Buddy, which gives parents more granular control over how and when their children use the internet.
Parental controls can be a great tool for protecting children from harmful content, and parents wanted very granular control wherever possible. But children have rights too: the UN Convention on the Rights of the Child sets out various rights, including protection from abuse and a right to privacy.
My personal experience means I know that LGBT sites are disproportionately blocked by parental control apps and often wrongly categorised as pornography when they have no explicit sexual content. This can deprive teens who are exploring their sexuality and gender identity of vital information. It’s also something that people without those experiences might not know.
But more importantly, my experience teaches me that there are always experiences and perspectives I’m missing; that I need to make an effort to search out the gaps in my knowledge.
It was a conversation with a domestic violence charity in the UK that made a difference to our product. Refuge, a UK charity, told me that one in four women in Britain experience domestic violence in their lifetime. In Latvia it’s estimated to be one in three.
This is astonishingly high. I also learnt that perpetrators are increasingly using technology to facilitate their abuse, including by tracking and controlling internet access.
The unfortunate conclusion for any mass-market product is that some of your customers will very likely be either victims or perpetrators of domestic violence. It’s pretty much guaranteed, so there’s no point ignoring it or claiming exceptionalism.
For Sky’s Broadband Buddy this meant that, unfortunately, it could be used by someone to control their partner or children. Sky can’t solve this social problem, but it can take steps to make sure it doesn’t make things worse.
Our tech solution was to make sure there were ‘escape routes’ for people in those situations.
In the offline world, if a child calls the UK charity Childline, that call won’t appear on the call log on the phone bill.
Sky created the digital equivalent in Broadband Buddy. The websites of a number of national helplines related to child safety, rape, domestic abuse and LGBT+ advice will never be tracked in Broadband Buddy’s history and can never be blocked.
For example, if a child wants to access childline.org.uk in the middle of the night, they can, even if their internet access has been turned off in the app.
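To make the idea concrete, here is a minimal sketch in Python of how such an escape-route allowlist might behave. The domain list, function and variable names are my own illustration of the principle, not Sky’s actual implementation.

```python
# Illustrative sketch only: a simplified content filter with an
# "escape route" allowlist. Names and domains are hypothetical.

# Helpline sites that are never blocked and never logged, even when
# the household's internet access has been paused in the app.
ESCAPE_ROUTES = {
    "childline.org.uk",
    # ...other national helplines for child safety, rape,
    # domestic abuse and LGBT+ advice would be listed here
}

def handle_request(domain, access_paused, blocklist, history):
    """Return True if a request to `domain` should be allowed."""
    if domain in ESCAPE_ROUTES:
        # Always allow, and deliberately skip the history log so the
        # visit is invisible to whoever controls the app.
        return True
    if access_paused or domain in blocklist:
        return False  # normal blocking rules apply to everything else
    history.append(domain)  # ordinary browsing is logged as usual
    return True
```

The key design choice is that the allowlist check comes first, so neither a pause nor a blocklist entry can ever override an escape route.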
An ethical framework is OK, but start by getting people to ask questions
The solution at Sky was to avoid the trap of pouring energy into a ritzy ethical framework for technology, and instead to look very specifically at a few vulnerable groups and their experiences, and to get product teams to ask questions.
Instead of trying to define what a concept like fairness means and then writing it up in a 30-page handbook to hand to designers, it is more effective to prompt people at all levels to ask questions about lived experiences they don’t know about.
You can apply this to AI as well, by the way. A machine learning expert recently described AI in the following way:
An AI system has two major sources of knowledge: (i) data and (ii) prior knowledge encoded by the engineering team. If we don’t have much data, then we may need to encode more prior knowledge.
This is a very effective analogy for what I’m talking about:
(i) think of the data as the diversity of your teams, and (ii) the encoded prior knowledge as the lived experiences that are listened to and included. Both need to be diverse.
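To see the trade-off the expert is describing, here is a toy example in Python (entirely my own illustration, not the expert’s): estimating a coin’s bias under a Beta prior, where scarce data lets the encoded prior dominate the answer.

```python
# Toy illustration of "data vs prior knowledge": estimating the
# probability that a coin lands heads, under a Beta(alpha, beta) prior.

def posterior_mean(heads, flips, alpha, beta):
    """Posterior mean of P(heads) given the observed flips and the prior."""
    return (heads + alpha) / (flips + alpha + beta)

# 2 heads in 3 flips with a weak prior: the data dominates the estimate.
print(posterior_mean(2, 3, 1, 1))    # 0.6
# The same data with a strong "fair coin" prior: the prior dominates.
print(posterior_mean(2, 3, 50, 50))  # ~0.505
```

With little data, what the engineers encoded carries the result. The parallel is that when lived experience is scarce in a team, the assumptions baked in by that team carry the product.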
June Sarpong has a great book called Diversify which has some useful ways we can all be more inclusive. And I would directly apply these to technology and service design.
Responsibility and ethics should be everybody’s business
Another way to think about it would be to look at different areas like these:
… and then to (1) find people to tell you about their experiences, and (2) write out the questions these people challenge you with, along with the statistics that show why this particular group matters.
Although it’s not the whole story, simply asking questions and listening to experiences is an important first step.
A new social contract
So, to come back to my original idea: a new social contract is required around technology and society.
One aspect of that new contract could be to check your own privilege and to understand the impact of your own actions. In a globalised world everyone has the potential to do good and to do harm. Thinking about marginalised people and bringing them and their voices to the surface must be a part of that new social contract.
It’s up to governments, citizens and technology companies to come together to recognise that responsibility and make it a reality.
About the Technology team at the Tony Blair Institute for Global Change
I’m Head of Tech and Society in a new team set up at the Tony Blair Institute tasked with creating a positive vision of the transformative power of technology.
The Institute is a centre-ground organisation that believes that, in order to counter the forces of populism, you have to give people (citizens and voters) a new vision of radical change. You need to build a picture of what society could look like and the transformative policies that can make a difference to people’s lives.
Our job is to equip the world’s leaders to master the revolution in technology so they can access its benefits and mitigate its risks.