Or, why a social scientist could be the most important person on your product team.
Ensuring more computer scientists have a rudimentary understanding of ethical decision-making will certainly broaden their horizons as individuals, but it’s not enough to transform how technology is made or how it affects society. A more radical proposal is to train arts and social science graduates in product and experience design, so that people who put people first have the skills to shape technology more confidently.
Teaching computer scientists ethics is not a silver bullet
There’s a prevailing argument that all computer scientists should be taught ethics as part of their undergraduate training. And while it’s a vital evolution in how computing is taught, it’s important to recognise that it’s not a silver bullet that will change the technology industry.
Ethics, on its own, won’t make tech more responsible
Certainly, teaching computer scientists and software engineers about ethics is a good idea. Such a good idea, in fact, that Doteveryone is already working with a UK university to embed the 3Cs of Responsible Tech as the basis of the ethics component of an undergraduate Software Engineering project. But it won’t, on its own, make technology more responsible.
A more transformative change both for boards and for product teams will be changing who makes decisions and how those decisions are made. After all, responsibility is a team sport and “ethics” are not a checklist. In the short term, this means product teams need help to embrace distributed decision making and make room for social-science expertise.
To put it bluntly: computer scientists and software engineers need not only to learn the basics of ethical decision-making, they also need to work in environments that encourage trust in, and respect for, non-technical or less-technical opinions. It’s not just who’s in the team that counts; it’s who makes the decisions.
It’s not just about who’s in the team, but who makes the decisions
Doteveryone is experimenting with ways to help teams shift from simply being multi-disciplinary to making multi-disciplinary decisions. Software engineers who are taught to recognise ethical dilemmas also need to embrace ambiguity and respect that not every hard problem can have an elegant solution.
We’re launching our Responsible Tech assessments in Spring 2019 (come along to our Responsible Tech conference in January to learn more). These are designed to help teams and boards capture their values, and see how they play out in real time, throughout the product development cycle. The assessments offer a simulation of how a product or service will behave in the world and help test how the logic of a feature will work when it encounters the messiness of real life.
The arts and social sciences must become essential voices in tech
But in the medium term, a bigger shift is needed: the arts and social sciences must become essential voices both in the technology industry and in the predictions that guide investment and entrepreneurship.
Knowing what to do with tech must become at least as valuable a skill as knowing how to make it.
There is a practical consideration here: within the next few years, automation will make it possible for social scientists to be the most important members of most product teams. Products and services like the open-source machine learning library TensorFlow and the collaborative coding community Glitch will open up the means of producing technology to more non-specialists. The hard work of computer science will still exist, but it is likely to become an even more rarefied and specialist discipline.
While this poses a risk of centralising power into a few productised methodologies, in this scenario knowing what to do with technology will be at least as valuable as knowing how to make it. And the hard work of innovation will continue to be as much about people and change as about making new things.
Too often “ethics” represent an optimistic afterthought
In the rational, data-first world of computing, and the associated rush to make a return on the investment in higher education, it is easy to forget the importance of debate, exploration and imagination in creating human meaning. In his book Misbehaving, Richard Thaler — one of the originators of behavioural economics — explains that most economic models don’t work in the real world because they “replace homo sapiens with homo economicus”. The same is happening in computing, which often expects humans to behave in knowable, predictable ways — whether that’s by only using software for the purposes it was created, or by conforming to the biases and stereotypes hidden in data.
Ethical considerations are often bolted on at the end of rational, data-driven workflows that do not represent how life is lived or technology is experienced. Too often “ethics” represent an optimistic afterthought, not a real change in priorities.
Instead of just prioritising checklists, we need new models for human-centred computing that begin with people and ethics, that make space for debate and ambiguity, and don’t regard hockey-stick KPIs as an end in themselves.
Revaluing humanities, arts and social sciences
Without a supply of social scientists, debaters and creative thinkers, technology does not stand a chance of adapting to humanity. The inequalities created by data-driven decision making will only become deeper unless multi-disciplinary thinking and debate are embedded in technology production.
A first step to achieving this would be to change and challenge our assumptions about what kinds of education are valuable. Coding on its own will not be enough. Learning how to code is a survival tactic. Critical, imaginative thinking is one of the most valuable human skills, and it can be taught much more effectively than it can be automated.
So it is time to revalue the humanities, arts and social sciences. It’s time to invert the learnt hierarchy, both in education and at work. Rather than constantly repackaging the arts as an “A” to put into STEM, it’s time to focus on rearranging the letters and put the “T” for technology into the arts and social sciences.