Virtuous technology

Artificial intelligence can solve big social problems, if companies are held to the highest ethical standards.

The RSA
RSA Journal
Jun 26, 2018

By Mustafa Suleyman, Co-founder of DeepMind

@mustafasuleymn

If we want to address society’s most pressing and persistent challenges, then technology will have a major role to play. From climate change to inequality, time and again we have struggled to keep pace with a changing world as the complexities of seemingly intractable problems overwhelm our capacity to intervene.

Scientific breakthroughs facilitated by artificial intelligence (AI) could make the crucial difference by helping to discover new knowledge, ideas and strategies in the areas that matter most to us all. For example, we are already seeing progress in improving the efficiency of large-scale industrial systems; at DeepMind we have used our technology to improve the efficiency of Google’s data centres, leading to energy savings of up to 40% in cooling systems.

But increasing public concern about some elements of the technology industry should serve as an urgent wake-up call. Of course, many technology companies began with altruistic mindsets. But the truth is that good intentions, initially captured in well-meaning slogans like ‘making the world a better place’, are now met with increasing unease by commentators and the public.

To be clear, this is not a critique of purpose-driven businesses; I genuinely believe these types of organisations will be key to our future. I do not doubt the sincerity of the motivations of the vast majority of the funders, founders and executives I have met over the years; these people really do want to ‘make a real difference’ and ‘do the right thing’.

Having said that, rising public concern should not be dismissed as simply a perception gap between the developers and users of technology; something deeper is at work.

There are at least three important asymmetries between the world of tech and the world itself. First, the asymmetry between people who develop technologies and the communities who use them. Salaries in Silicon Valley are twice the median wage for the rest of the US and the employee base is unrepresentative when it comes to gender, race, class and more. As we have seen in other fields, this risks a disconnect between the inner workings of organisations and the societies they seek to serve.

This is an urgent problem. Women and minority groups remain badly underrepresented, and leaders need to be proactive in breaking the mould. The recent spotlight on these issues has meant that more people are aware of the need for workplace cultures to change, but these underlying inequalities also make their way into our companies in more insidious ways. Technology is not value neutral — it reflects the biases of its creators — and must be built and shaped by diverse communities if we are to minimise the risk of unintended harms.

Second, there is an asymmetry of information regarding how technology actually works, and the impact that digital systems have on everyday life. Ethical outcomes in tech depend on far more than algorithms and data: they depend on the quality of societal debate and genuine accountability.

Making this happen has to be a collaborative effort, and requires new types of organisation that facilitate deep understanding of how complex algorithms work and their impacts on society. This takes courage, trust and a willingness to prioritise real debate and engagement over the comfort of our institutional roles; too often, activists, governments and technologists are more likely to criticise one another than to work together.

One of the new multi-stakeholder forums is the Partnership on AI, which brings together industry competitors, academia and civil society to discuss the ethics of machine learning, including issues such as fairness, transparency and accountability. The board has equal representation from corporations and nonprofits, making it a truly cross-cutting effort.

There also need to be new technical solutions that enable a wide range of stakeholders to have much greater visibility of how data is used. Interesting efforts are under way within companies, from the increased use of Transparency Reports, to technologies such as DeepMind’s Verifiable Data Audit (VDA), which aim to make all interactions with a dataset cryptographically logged and auditable. The VDA, for example, allows organisations and individuals to see what data has been used, for how long and for what purpose. Efforts like these will hopefully create real accountability between organisations using data and those they seek to serve.
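The core idea behind a cryptographically logged, auditable record of data use can be sketched as an append-only hash chain, in which each entry commits to the one before it, so tampering with any record invalidates every later one. The sketch below is purely illustrative and makes no claim about how DeepMind’s actual VDA works; the `AuditLog` class and its fields are hypothetical.

```python
import hashlib
import json
import time

class AuditLog:
    """Append-only log sketch: each entry's hash covers the previous
    entry's hash, so altering any record breaks verification of all
    later entries. (Hypothetical; not DeepMind's actual VDA design.)"""

    GENESIS = "0" * 64  # placeholder hash for the first entry's predecessor

    def __init__(self):
        self.entries = []

    def record(self, actor, dataset, purpose):
        """Log who accessed which data, and why."""
        prev_hash = self.entries[-1]["hash"] if self.entries else self.GENESIS
        body = {
            "actor": actor,
            "dataset": dataset,
            "purpose": purpose,
            "timestamp": time.time(),
            "prev": prev_hash,  # link to the preceding entry
        }
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append({"body": body, "hash": digest})

    def verify(self):
        """Recompute every hash and check that the chain links up."""
        prev = self.GENESIS
        for entry in self.entries:
            if entry["body"]["prev"] != prev:
                return False
            digest = hashlib.sha256(
                json.dumps(entry["body"], sort_keys=True).encode()
            ).hexdigest()
            if digest != entry["hash"]:
                return False
            prev = digest
        return True

log = AuditLog()
log.record("clinician-42", "patient-scans", "treatment-planning")
log.record("researcher-7", "patient-scans", "model-evaluation")
print(log.verify())                       # True for an untampered log
log.entries[0]["body"]["purpose"] = "x"   # tamper with the first record
print(log.verify())                       # now False: the chain is broken
```

The point of the chain structure is that an auditor who holds only the latest hash can detect any retroactive edit to the log, which is what makes ‘what data was used, for how long and for what purpose’ a checkable claim rather than a promise.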

Academics and nonprofits are also developing ways to make the impacts of algorithms easier to understand. For example, MIT Media Lab researcher Joy Buolamwini and the Algorithmic Justice League have created museum exhibits to increase awareness of the deeply disturbing ways facial recognition technologies often fail for individuals with darker skin tones.

This work is critically important.

As well as the ethical responsibility to avoid new harms, many in the AI field also see the potential for new tools to actually improve social justice.

In the realm of finance, for example, a sophisticated credit-scoring system — if built with fairness and accountability at heart — could be far more transparent than the historical alternative, where a bank manager would decide who gets a loan, without any real obligation to provide proper explanation, and no meaningful way to address any biases that may influence the decision.

Third, and this is by no means unique to tech, we need to address the asymmetry of motivation between market-based incentives and the other societal goals we aspire to. The standard measures of business achievement, from fundraising valuations to active users, do not capture the social responsibility that comes with trying to change the world for the better.

This disconnect starts early. There might be a lot of money in tech, but the vast majority of entrepreneurs still fail. Any founder hoping to get a new business off the ground has to convince investors and new hires of future growth, and then deliver that relentlessly. Doing this takes single-minded focus on the metrics that appear to matter, with little room to consider complex societal externalities or listen to naysayers.

That is partly why some of the world’s brightest minds gravitate towards the safest and most proven ideas and business models. They end up creating new services to personalise soda drinks when half a billion people do not have access to clean water, or new ways to order food by phone when more than 800 million people are malnourished. Why is it that we can go on a date with a stranger we meet on an app in minutes, but nurses and doctors carrying out life-saving treatments still use pagers and fax machines to communicate with one another?

We need new incentive-based legal structures — ones that put social benefit on the same plane as profit — to encourage more founders to take on real-world problems, and to do so with ethics at the heart. The private sector must bring the same innovation drive that has created so many amazing new products and services over many decades to the modern challenge of designing systems that are ethical and accountable. There is clearly room for innovation here.

None of this is easy. But with rigorous attention to technology’s capabilities, research into its inputs and impacts, greater transparency, and a reorientation of incentives, we can break through the complexity that makes society’s problems so hard to tackle. If we can deploy these tools broadly and fairly, fostering an environment in which everyone can participate in and benefit from them, we have the opportunity to enrich and advance humanity as a whole. All of us who believe in the power of technology must do everything we can to ensure these systems reflect humanity’s highest collective selves.


We are the RSA. The royal society for arts, manufactures and commerce. We unite people and ideas to resolve the challenges of our time.