A model to help tech companies make responsible technology a reality
In a landscape of scandals and public concern over everything from data breaches to social media harassment to the impacts of automation, tech companies have come to realise that new, responsible ways of working are now a business imperative.
“Responsible technology is no longer a nice thing to do to look good, it’s becoming a fundamental pillar of corporate business models. In a post-Cambridge Analytica world, consumers are demanding better technology and more transparency. Companies that do create those services are the ones that will have a better, brighter future.”
Kriti Sharma, VP of AI at Sage — one of the UK’s largest tech companies, speaking at a recent Doteveryone event.
But adopting a Responsible Technology approach isn’t straightforward. There’s currently no roadmap, or even a common language, for how to embed responsible technology practices in practical and tangible ways.
That’s why Doteveryone has spent the last year researching the issues organisations face, and we’re now developing a model that will help organisations do just that.
The 3C model helps to guide organisations on how to assess the level of responsibility of their technology products or services as they develop them.
It’s not an ethical bible which dictates right from wrong, but a framework which gives teams space and parameters to foresee the potential impacts their technologies could have and to consider how to handle them.
Our 3C Model of Responsible Technology considers:
- the Context of the wider world a technology product or service exists within
- the potential ways technology can have unintended Consequences
- the different Contribution people make to a technology — how value is given and received
We are developing a number of assessment tools which product teams can work through to examine and evaluate each of these areas in real time during the development cycle. These assessments range in form from checklists to step-by-step information mapping to team board games.
For a technology to be responsible, it needs to understand and respect the ecosystems it operates within.
When you design a technology, you need to think about the big picture that technology will live in and the impact it can have. Designing products around a single user’s journey, which focuses only on a consumer interacting with a product in isolation, needs to evolve. In practice, we all experience technology in different ways in different aspects of our lives, and we need to incorporate this understanding into the design of products.
Doteveryone’s Digital Understanding model shows how we relate differently to technologies as individuals, as consumers, as members of society and as workers.
Responsible technology companies need to consider the many ways a user exists in the world and the different ways people will interact with their technology in these contexts.
And they need to accept that contexts can overlap and co-exist simultaneously (for example, you can be at work while speaking to your family) and be comfortable with that messiness.
Responsible technology needs to consider the potential unintended consequences beyond the specific intended purpose of a product or service.
Companies have to learn how to anticipate what those consequences might be, and every contributor to a tech product needs to participate in identifying them. This proactive approach mitigates the risk of products having unforeseen side-effects — an issue many less thoughtful companies are currently facing.
But it’s important to remember that consequences don’t have to be negative — there are unexpected benefits as well as drawbacks and backfires — and we want to capture both the positives and the harms.
We’ve identified five categories of causes of unintended consequences in technology. It’s not a comprehensive list, but a starting point to help us move forward.
- Displacement: technology that changes, eliminates or creates jobs, industries or public services
- Unintended uses: technology that is used in ways that were not originally intended
- Changes in social norms: technology that is used as intended, but that shifts social norms or behaviours, or amplifies our worst human instincts
- Weak security, support, monitoring and reliability: when bugs, biases or holes in technology are exploited to cause harm
- Understanding blindspots: when companies are not clear about how their technologies work or the business models that underlie them. This can leave users with little agency, and lead to backlash and a loss of trust when the reality becomes clear
It’s impossible to control for all of these — mistakes are inevitable and happy accidents frequent. And working through them does not mean that a company should bear sole liability for the consequences of their tech.
What matters here is the process to purposefully consider these potential unintended consequences and, once identified, put in place structures to minimise their impact and engage effectively with those affected. Exploring and sharing how this is best done will be a key part of building best practice and engaging cross-sector, as addressed in Contribution.
The last part of the 3C Model of Responsible Technology is Contribution.
In Doteveryone’s research we found the public has very limited understanding of the business models behind tech companies — for example, around two-thirds did not realise that search, apps and social media generate revenue from data.
Many consumer-facing tech products and services are essentially founded on an exchange of data for service. But that is not clear to the people who use them. A responsible technology needs to make this transaction explicit and give people enough information and choice to decide if this value exchange is fair.
But it’s not just about data. Products also rely on many hidden contributions in the form of micro-labour and informal labour. A better-known example: those “I’m not a robot” captchas you’ve been filling out for years have been helping Google build its machine learning datasets and train its algorithms. This isn’t inherently a bad thing. But responsible technology ensures that people are aware this is what’s happening, understand what they’re contributing to and have a choice about it.
Contribution also examines what the technology itself or the learning from its development can contribute to industry best practice, and how contributing to better cross-sector collaboration can result in safer, fairer products.
Methodology and next steps
We’ve arrived at Doteveryone’s 3C Model of Responsible Technology through a process of research and iteration, based around:
- Doteveryone’s People, Power and Technology research which explores the UK population’s understanding of the impacts of technologies — how they shape people’s lives and society as a whole
- Insight and input from consultation with a wide range of people across industry, academia, government and regulators, and from the community of organisations that share Doteveryone’s objectives
- Ongoing testing with tech companies and professionals, starting with our Trustworthy Tech Partners who helped test our initial ideas
When we first discussed the 3Cs of Responsible Technology we spoke about context, contribution and continuity, but through this iterative process we found that new pillars of context, consequences and contribution were better able to address the most urgent issues.
From September we’ll be conducting rolling testing of the model and the assessments we’ve developed.
We welcome input and feedback so please get in touch if you’d like to be involved.