The Mundane Design of Horror: A Call to Action

Mike
Published in Nope. · 10 min read · Mar 7, 2018

‘The greatest crimes in the world are not committed by people breaking the rules but by people following the rules.’

Banksy (Wall and Piece)

Warning: This is going to be an angry article. If you don’t like examining your privilege, then you need to read this. If you believe social progress is some linear fact, then you need to read this. If you get uncomfortable when your role in dehumanisation, exploitation, and destruction is made apparent, then you need to read this. If you’re seriously pissed about the state of things, take a deep breath, we’ll get through this.

Marginalised people experience discrimination because of existing structures, norms, and dominant beliefs. Encoding technologies with these norms will allow discrimination to happen with more efficiency, and even less consideration. This can only quicken the pace of injustice-making, while enabling post-hoc rationalisations of discrimination as if it were an objective fact. Making systems of domination ever more durable will only make injustice ever more prevalent.

Tech workers are central to this next phase of injustice-making; many of us don’t realise that; many others actively deny it. The cop-outs are rife. We’ve all heard them:

I’m just an engineer. I’m just doing my job. We’ve got to cut costs. We need to increase revenue. We have to reduce crime. We have to hit our Key Performance Indicators. We need to modernise. We need to privatise. We’ve got to keep clients happy. How can we function as a business if we don’t do this? We need to get with the times. What’s the alternative? My boss made the call.

These excuses are what let injustice slide. But they’re also reflective of how disempowered we are. We can’t combat injustice on our own.

Automating Injustice

Injustice manifests in many complex ways. But the most obvious way we see it is in the ‘justice’ system. When components of the justice system are made more efficient, more objective, and become automated, this is probably bad news.

Minority groups are over-represented in prison populations in America (~35% Black Americans), New Zealand (almost 60% Māori), Australia (~84% Aboriginal and Torres Strait Islander peoples in the Northern Territory), and many other parts of the world. We can’t expect these numbers to decrease when the objective is to make the same system operate more efficiently.

This is already happening in Australia with predictive policing algorithms. The Suspect Targeting Management Plan (STMP) is being used to target people as young as 11:

The STMP algorithm calculates how likely it is that a person will offend, and then classes them as ‘extreme risk, high risk, medium risk, or low risk.’ What exactly these criteria consist of isn’t publicly available. But it’s the public who are impacted. Those subject to the STMP have experienced constant police harassment, despite some individuals having no prior convictions.

What’s concerning, albeit unsurprising, is who the STMP targets. It’s biased towards young people, but it especially targets Aboriginal and Torres Strait Islander Peoples, who make up 44% of STMP nominees but only around 3.3% of the total Australian population. Being an STMP nominee ‘is being used by police as a substitute for having “reasonable grounds to suspect” a person has committed an offence.’ This is a classic case of being treated as guilty just because you’re a person of colour.

To get an idea of what that feels like, watch this warm interaction between an armed white male police officer and a person of colour (and note your bodily response to it, keeping in mind that people of colour are being murdered by police for much less than this):

We could dismiss this as just some racist prickery. But one armed and racist prick is still one too many. And it’s not just one guy, it’s systemic. Imagine how much more confident this police officer would have been if his racial profiling were backed by an ‘objective’ algorithm. Imagine what Officer Prick might have done if an algorithm told him that this person of colour had probably committed an offence.

An algorithm based on racist profiling would enable more armed racists with fragile egos to justify their behaviour. It would make their baseless hostility seem reasonable. It would lead to guns being pulled out a touch sooner.
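To make that feedback loop concrete, here is a deliberately simplified, hypothetical sketch of how a risk score ‘trained’ on historical arrest records inherits policing bias. This is not the actual STMP (its criteria aren’t public); the group labels, rates, and scoring rule are all invented for illustration only.

```python
import random

random.seed(42)

# Hypothetical scenario: two groups with the SAME true offending rate,
# but group B is policed twice as heavily, so its offences are twice
# as likely to end up in the historical record.
TRUE_RATE = 0.05
DETECTION = {"A": 0.2, "B": 0.4}  # chance an offence gets recorded

def simulate_records(n_per_group=10_000):
    """Generate (group, offence_recorded) pairs under biased policing."""
    records = []
    for group in ("A", "B"):
        for _ in range(n_per_group):
            offended = random.random() < TRUE_RATE
            recorded = offended and random.random() < DETECTION[group]
            records.append((group, recorded))
    return records

def naive_risk_score(records):
    """'Train' by measuring recorded-offence frequency per group —
    which is what any model fed historical arrest data learns."""
    counts, totals = {}, {}
    for group, recorded in records:
        totals[group] = totals.get(group, 0) + 1
        counts[group] = counts.get(group, 0) + recorded
    return {g: counts[g] / totals[g] for g in totals}

scores = naive_risk_score(simulate_records())
# Group B ends up scored roughly twice as 'risky' as group A,
# even though both groups behave identically.
print(scores)
```

The model isn’t detecting behaviour; it’s detecting where the police were looking. Feed its output back into patrol decisions and the gap compounds.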

Fuck that.

Yet this is the direction the U.S. is heading by using AI to ‘identify’ gang members:

How these technologies will be used is evidently not something those making them think about, or even care about:

Hau Chan, a computer scientist now at Harvard University who was presenting the work, responded that he couldn’t be sure how the new tool would be used. “I’m just an engineer,” he said. Lemoine quoted a lyric from a song about the wartime rocket scientist Wernher von Braun, in a heavy German accent: “Once the rockets are up, who cares where they come down?” Then he angrily walked out.

So how does this happen? And how might we address it? Zygmunt Bauman has a lot to say on this using the confronting case of the Holocaust. I’ll bring his analysis together with the conduct of tech workers to help you see how horror can emerge from what seems like moral and reasonable activity.

You Won’t Look so Neutral in the History Books

Tech workers inside project teams are so focused on the task at hand. We’re quick to fetishise material progress, technological upscaling, and efficiency, as if they all defined good design. But we fail to see how these things can also enable horror.

Individuals specialising in components of a whole are alienated from the outcomes of their work and the people who are impacted by it. That’s how rational conduct can produce immoral ends: by slicing up complex things into manageable categories that are unrecognisable from their real-world forms. It’s what science does so well. Designers are great at it too. And it’s how the Holocaust was made possible.

Being a good person doesn’t make you immune to producing horror. But the production of horror depends on your compliance in an unjust system. That’s what you’ll be remembered for, and it won’t look so neutral in the history books. Most SS members would have passed psychiatric tests. But we’ll remember them for being Nazis.

Obedient, focused, disciplined, loyal. That’s what Nazis were. They were good workers. They complied with organisational rationale. Tech workers do too. Great designers are great collaborators. They can function as part of a team. But tech workers increasingly look like social engineers for fascist, coercive, and exploitative entities. Maybe your teams and organisations are fundamentally fucked. Maybe the people who ultimately drive and benefit from our activity are counting on our ignorance.

Tech workers need to stop blindly following the rules, celebrating the process, and thinking of their work (and tech) as neutral and objective. ‘We can’t predict how people will use it’, you might reply, in a privileged defence, when you won’t suffer the consequences of its misuse. Sure. But if what you made enabled horror, then you enabled horror.

Technology is increasingly alienating us from the consequences of our actions. It is increasingly externalising the costs of privilege, and increasingly obfuscating the connections between what we do and how it relates to injustice. There’s nothing immediately immoral about pushing a button, so long as you remain ignorant of that button firing up a system that subordinates and eradicates a people.

We need to examine our work as part of a larger whole. If doing that is crippling, and causes inaction, then that’s probably what we need. Stop fetishising ‘doing’ when you aren’t even trying to grasp the totality of what is being done.

Designers need to engage with their work through a lens of power. We don’t just solve problems, we empower and disempower. We need to get better at identifying who we’re making more powerful, and at whose expense.

With Ethical Design, We can (Still) do Just About Anything.

Ethics: A set of moral values, principles, and knowledges that govern conduct

While I’m encouraged by the newfound emphasis on design ethics, I’m not confident it’s going to do the job. Design ethics has been reduced to little more than ‘not doing bad’.

The general sentiment is admirable:

‘Don’t exclude. Don’t manipulate. Don’t destroy. Be considerate.’

But these principles glaze over deeper motives that remain unchanged. If they were more detailed and accurate, they’d probably sound like this:

‘Don’t exclude people when we can actually make them our customers and expand our market base. If they don’t have money or the internet, too bad.’

‘Don’t manipulate and exploit people, *nudge* them (into using our product and making us money).’

‘Go digital to externalise costs, but tell everyone it’s about making services more accessible, convenient, and sustainable (and if they refuse, tell them it’s the way of the future).’

‘Let’s assimilate diverse groups of people into the white capitalist ableist heteropatriarchal sphere; don’t tell them that they’ll mostly be subordinate to white men who’ll never stop talking about how progressive they are.’

If this is ethical design, then we have some shit morals. I’m not sure how noble it all is. Nothing much has changed; and people will still get away with a hell of a lot. It’s pretty clear that this shift doesn’t seek to challenge the fundamentally exploitative, destructive, and expansionist logic of this culture. The design ethics trend is in danger of turning into the latest cop-out and free pass. It’s adopting the same kind of passivity as Google’s ‘Don’t be evil’ motto — which has clearly been pretty malleable. We have to be careful not to celebrate red herrings. We need to do more.

Stop Being an Ethicist. Start Being an Activist.

Activism: Advocating energetic action; advocating the abandonment of neutrality

Design culture is constantly evolving. The first notable design turn was about considering user needs. Then we realised that we were actually designing experiences. Eventually, designers became human-centred.

Then, design culture confronted its own hypocrisy with a push for empathy. But we weren’t required to act on the empathy we generated, and execs don’t have time to feel. Their job is operational; they’re most moved by cold, hard numbers.

Now, in striving to address our concerns with the nature of being designers in a capitalist paradigm, we’ve arrived at design ethics. And it’s lacklustre. It’s the liberal turn we’re comfortable with, but it’s not the radical change we need.

The next logical step is for us to become activists. And it’s come way too late.

We need to be more active if we are serious about creating meaningful change. Examine the purpose and cost of new and existing systems. But injustice won’t go away just because we’re aware of it. It needs to be confronted. We need to combat this in every way we can. Injustice doesn’t disappear when we turn down a job or refuse to work with a certain client (and who but the most privileged can afford to do that anyway?). Avoidance is a passive act. You are still complicit. You need to take action.

You can’t just refuse to enable injustice and expect others to do the same. You have to subvert its course. You can do this inside and outside of your profession as a tech worker. You don’t need to be in the relevant industry or field to find its developments troubling.

And you can’t do this alone. We need to make movements, not heroes.

‘It is essential to resist the depiction of history as the work of heroic individuals in order for people today to recognize their potential agency as a part of an ever-expanding community of struggle.’

Angela Davis (Freedom is a Constant Struggle)

There is no hero to this story. There is no saviour. Not tech. And especially not the billionaire class. That wealthy white guy’s not going to come to our rescue. And quite frankly, fuck him.

We can’t hope that executives, politicians, CEOs, and other ‘leaders’ can be convinced to do something that will compromise their wealth, power and privilege. Because of course they won’t, they have the most to lose here.

We need to remind them that their privilege, power, and wealth are things we create with our labour. If they won’t stand with us, then their stance is clear, and it is up to us to stand together.

Change starts right where we stand, especially when we stand together.

Form collectives. Organise with co-workers. Unionise. Show solidarity. If we aren’t listened to, then we need to act.

We need to withhold our labour as a collective force. Strike. Be a whistleblower. Engage in direct action. Start picketing. Join with other campaigns against injustice. Don’t seek to maintain your privilege if it means maintaining injustice. An injury to one is an injury to all.

We need to stop begging for change and start demanding it. We need to assert that things are going to change.

Being ethicists isn’t enough. We need to be activists.

Mike

Human-centred design | Anthropology+Sociology PhD cand. | Critique design, power, capitalism, modernity | Black radical anarchist intersectional feminist things