Regulating for Responsible Technology: Introducing the Office for Responsible Technology
Today Doteveryone publishes Regulating for Responsible Technology: Capacity, Evidence and Redress. The paper outlines our vision for a new independent regulatory body that will direct digital technologies for the public good.
We recommend establishing an Office for Responsible Technology that will:
- Empower regulators with the capacity to hold technology to account. The Office will identify what powers regulators need, support them to build specialist skills and look ahead to help them anticipate emerging issues in their sectors.
- Inform the public and policymakers so that regulation is founded on an authoritative body of evidence about the benefits and harms of technologies and the public has a source of independent and understandable information. The Office will also create consensus around a future vision for technology to underpin the regulatory system.
- Support people to seek redress for technology-driven harms by ensuring people’s complaints are fairly handled. The Office will mediate unresolved disputes and ensure regulators learn from the experience of the public.
The body’s three core functions (more on each of these below) are major undertakings in their own right and will require significant investment. Our estimate, based on existing analogous bodies, is that it will cost in the region of £37 million a year to run — a significant sum, but a no-brainer if seen as a long-term investment in our digital regulation infrastructure. The value of building the public’s trust in technology by mitigating digital harms and empowering regulators to allow responsible innovation to flourish is enough to pay this back a hundred times over.
There is current political appetite for change, with momentum from initiatives such as the establishment of the Centre for Data Ethics and Innovation. The Government should capitalise on this to make the Office for Responsible Technology a reality. But to do so it must radically rethink and expand the remit and ambition of the Centre to take on the Office’s responsibilities.
Doteveryone’s regulation journey so far…
Our Regulating for Responsible Technology programme has been a long, and fascinating, road. My back-of-the-envelope estimate is that we’ve spent over 35 hours (48 interviews at an average of 45 minutes each, if you were wondering), gaining invaluable insights from regulators, lawyers, standard developers, policymakers, human rights groups, futurists, consumer advocates, academics, technologists, data ethicists, authors and more. And that’s before you factor in the public consultations, workshops and infinite volumes of desk research.
We started by exploring what a new independent regulator for the internet would look like. In the middle, we published our review of the current landscape for digital regulation in the UK and debated its conclusions with an event in Parliament.
Today, we publish the next chapter.
The recommendations we make in this paper are a significant departure from our initial premise of a single command-and-control regulator for all of the internet. We came to realise that idea couldn’t work. In reviewing the regulatory landscape, and during the course of our consultations, we found that all regulators are now grappling with the impacts of digital technologies in their sectors, and therefore all parts of the regulatory ecosystem need to be empowered to positively shape digital technologies.
We also found that the current lack of evidence on the benefits and harms of tech is holding back sensible debate and workable policies, and that the public is struggling to get redress for tech-driven harms.
Below I summarise why we need a new Office for Responsible Technology, what it does and how it delivers on the three core functions. In the coming weeks, I’ll be following up with some more in-depth posts on each element of the system.
1. Empowering regulators
Some regulators, such as the Financial Conduct Authority (FCA) and the Human Fertilisation and Embryology Authority (HFEA), are experienced in anticipating the challenges on their horizons.
However, the majority have yet to embrace this forward-looking, innovation-friendly ethos, and many regulators also lack the remits, resources or technical expertise to respond to the challenges and opportunities that digital technologies present to their sectors.
As a hub of industry-standard expertise, the Office for Responsible Technology would build up the digital capacities of regulators across the board.
This would be done in a number of ways:
- Scrutinising regulators’ remits to make sure their resources and statutory powers are fit for purpose, reviewing how best to address the current blind-spots in online political campaigning regulation, for example.
- Collaborating with regulators to develop sector-specific solutions for technologies with application across the economy such as distributed ledger technologies.
- Leading foresight activities to alert regulators to emerging risks and opportunities, such as security threats associated with quantum computing.
2. Informing the public and policymakers
Over the course of this project, we heard repeated pleas from the many and diverse individuals and organisations we spoke to for a more rigorous evidence base around online harms.
Without this basis for debate, the task of navigating the online world becomes difficult for politicians and public alike.
Our People, Power and Technology research found 92% of the UK public would like a single place where they can find out what their rights are online, but only 28% feel they know where to go for help. For policymakers, uncertainty around these issues lends itself to initiatives based on anecdote and intuition — as questionable proposals such as banning mobile phones in schools show.
The Office would address these issues by:
- Commissioning and conducting research into the benefits and harms of technologies, tapping into the good work already being done in academia and groups such as the UK Council for Internet Safety.
- Providing clear, accessible guidance to the public around online issues, alerting them to short-term risks such as malware outbreaks and promoting a long-term culture change in their relationship with digital technologies.
- Articulating the values that underpin regulation through deep public consultation and industry engagement. For example, understanding how society weighs up freedom of speech against freedom from abuse can help unpick debates around moderating hate speech on social media.
3. Supporting people to seek redress for technology-driven harms
Public apologies for digital wrongdoing from big tech companies are now part of the furniture, with each new issue leading to minimal long-term change (as this Wired piece about “Facebook’s 14-year apology tour” sums up).
And public trust in many online services is at an all-time low. Edelman’s Trust Barometer finds only 36% of the public trust search engines and platforms (down 4% since 2017), whilst research by the ODI finds only 10% trust social media providers with their data.
Against this backdrop, we see an urgent need for the Office to address this accountability deficit and rebuild disintegrating public trust in digital technologies by:
- Setting standards for handling public complaints about the impacts of technologies, auditing companies’ complaints-handling processes and championing best practice.
- Providing backstop mediation when companies’ internal processes fail, conducting alternative dispute resolution and enabling the public to seek redress where appropriate. This should also encompass collective redress where issues have affected large groups of people — for example, where an algorithm has discriminated against people of a certain gender or ethnicity.
- Sharing insights to flag emerging issues and inform regulatory practice. Operating on the front-line of consumer harm, the Office would be well placed to flag spikes in complaints and misconduct to regulators and government.
Sound like a good idea? Get in touch at firstname.lastname@example.org to discuss how we can make this a reality together.