The Office for Responsible Technology: Informing policymakers and the public

Illustration: Elin Matilda Andersson

Earlier in October we published Regulating for Responsible Technology: Capacity, evidence and redress. In the first of a series of four posts, I sketched out our proposal for a new independent regulatory body — the Office for Responsible Technology — to empower regulators, inform policymakers and the public about digital technologies, and support people seeking redress from technology-driven harms.

This post explores the second of these functions — how the Office can provide clarity and guidance to policymakers and the public around digital harms and opportunities.


Doteveryone’s vision for an Office for Responsible Technology is founded on a systems approach to regulation. This acknowledges that complex and dynamic digital technologies are beyond the control of one institution alone.

A few ways our institutions create digital change. Credit: Rachel Coldicutt

Technologies are instead shaped by a constantly evolving web of interactions between the state, the public, civil society* and the users, developers and owners of digital technologies. Regulation that marginalises any of these groups is unrepresentative of society, and will ultimately have lopsided and unfair outcomes.

So whilst I explained in last week’s piece how the Office would have a role in strengthening the regulators themselves, here I dive a little deeper into how it can empower and engage civil society, the public and policymakers by:

  1. Articulating a vision for technology and society through ongoing consultations with the tech sector, civil society, academia and the public
  2. Evidencing the benefits and harms of digital technologies with rigorous research and evidence reviews
  3. Informing and empowering the public with co-ordinated media campaigns and engagement

*In this context I use ‘civil society’ in the broadest sense — encompassing the social sector, think tanks, academia, rights groups, activists and any organisations whose primary purpose is to advance the interests of society.

1. Articulating a vision for technology and society

“An algorithm is an opinion embedded in math” Cathy O’Neil

Technology is not neutral. Values are baked into digital technologies at every stage of their lifecycle, from investment strategies through design choices to the multiplicity of ways they are adopted by society. The important question is not whether values are represented in and by these technologies, but whose.

As powerful actors from Silicon Valley to the Chinese Politburo continue to impose their world-views upon the digital world, there is an opportunity for the UK to show an alternative path by setting the gold standard for responsible technologies that have society’s interests at heart.

The Office for Responsible Technology can play a key role in articulating this vision, conducting deep consultations with the public and other stakeholders to define these responsible technology principles.

This ongoing dialogue can explore society’s views on ethical trade-offs in tech. How is free speech valued in relation to freedom from abuse on social media, for example?

It would also help to gauge opinion on the future applications of technology that have significant societal impacts, such as the automation of care work.

“Reading @doteveryoneuk’s v. interesting call for an office of responsible technology. Can it go further on public engagement? Alongside high level principles, how about regular dialogue between the public and regulators on the detailed work of regulation?” @Tom_saunders

These principles will help to reframe the approach of regulators, technologists and government. During the consultation for this work, some raised concerns that programmes led by regulators to encourage innovation in their sectors favoured tech with economic potential over social benefits. By looking through the lens of Responsible Technology, regulators would have greater licence to intervene when tech’s impacts don’t fit neatly into a narrow definition of “consumer welfare”.

The insights from consultations can be incorporated into toolkits used by coders and technology developers to promote responsible design, and help government and local authorities to ensure funding priorities for digital technology align with the public’s interests.

2. Evidencing the benefits and harms of digital technology

A lack of evidence undermines both effective regulation and innovation. Ongoing ambiguity around the nature of online harm muddies the public and political debate, lending itself to policymaking by anecdote and outrage. Uncertainty about the risks and opportunities of technologies means regulatory action and public investment lags far behind fast-moving technology.

The Office for Responsible Technology would provide an impartial, authoritative and rigorous body of evidence around digital benefits and harms.

“As you’ve heard from many academics and clinicians the evidence just isn’t there at the moment. Legislation and policy-making really needs to be based on solid evidence. Otherwise there’s a danger of having unintended consequences” Google’s Claire Lilley addressing the Science and Technology Select Committee on 16 October

But it would not seek to reinvent the wheel. Before conducting any independent research itself, it would first gather the considerable evidence and research already being produced in academia, civil society and networks such as the UK Council for Child Internet Safety. This will highlight where the gaps are and inform the body’s priorities going forward.

In areas where evidence is thin on the ground, the Office will:

  • Conduct original research using in-house expertise
  • Commission and fund third-party research where others can deliver better results (for example, if a study involves vulnerable groups that require specialist skills)
  • Convene research networks to share approaches for researching complex and evolving technology issues (e.g. algorithmic transparency and discrimination)
  • Independently evaluate interventions to address digital harms (such as “inoculating” people against online misinformation).

To enable this, the Office must have statutory powers to compel the owners of digital technologies to provide the information it needs to make this research possible. This information would, where necessary, be shared confidentially to ensure intellectual property rights are respected and to encourage a culture of collaboration, not confrontation.

3. Informing and empowering the public

As a parent reading headlines that the effect of screen-time on your child is both “like a gram of cocaine” and “can be good for kids”, how on earth can you make an informed decision? Without robust evidence and clear messaging around online harms and opportunities it’s fiendishly difficult for the public to navigate the digital world.

As members of communities and as a society, we must also have the knowledge to critically engage with digital technologies. As the recent lynching of five innocent people in India following the viral spread of false information on WhatsApp shows, the societal stakes for building digital understanding are arguably even higher.

To build up public digital understanding, the Office will:

  • Deliver short-term communications that alert the public to immediate risks from digital technologies, working with bodies such as the Information Commissioner’s Office and the National Cyber Security Centre to flag large-scale data breaches or ransomware outbreaks
  • Lead longer-term campaigns to promote positive change in the public’s relationship with digital technologies. These engagements could, for example, seek to address current blind-spots (identified in our People, Power and Technology research) around how digital services are personalised and funded
  • Raise awareness of emerging digital risks and opportunities. If the public is to realise the potential of new innovations such as “data trusts”, they will need clearer guidance on what they are and how they stand to benefit from them — The Open Data Institute note this is a term that’s currently used very ambiguously.

Public understanding is not a panacea, however, and when the average person would currently need to spend 250 hours a year reading all their online Ts & Cs, lumping sole responsibility on them to keep up with the intricacies of digital technologies is both unreasonable and unfair. But an empowered public that knows when their rights have been wronged, and can fully grasp the opportunities digital tech opens up, is a vital bottom-up driver of change.

“I believe the technology and innovation are not only essential for progress, but can be powerful drivers of equality, and public good… I would like to see an Office for Responsible Technology play a proactive role in fairly communicating the benefits of technology as well as just access to information for those with concerns.” Ben Maynard, in an email to Doteveryone.

Does this sound like a good idea to you? Get in touch at hello@doteveryone.org.uk to discuss how we can make it a reality together.