Featured Stories

Data Violence and How Bad Engineering Choices Can Damage Society

Cultural harms can go well beyond search results — which can be bad news for vulnerable communities


Is “Data Violence” Really a Common Problem?

In 2015, a Black developer in New York discovered that Google Photos’ image-recognition software had tagged pictures of him and his friends as gorillas.


Are Engineers Asking the Right Questions?

Discussion of all these cases quickly veers toward fairness, accountability, and transparency. A kind of forensic instinct kicks in: How, technically, did this happen? How, technically, can we prevent it from happening again?

If you have the temerity to insert your work into a political issue that…doesn’t immediately affect your life, you should also be prepared to accept the consequences — or, at the very least, answer a few hard questions.

“It’s almost never possible to evaluate the utility of an algorithm by looking at the code or measuring it against a mathematical formula,” computational social scientist J. Nathan Matias points out. “To evaluate the risks or benefits of an algorithm, we need to study its impact in people’s lives, whether in controlled lab conditions or in the wider world.”


This Is What Institutional Prejudice Looks Like in the Digital Age

At the turn of the millennium, the U.S. Department of Agriculture deployed an automated system to detect fraud in the Supplemental Nutrition Assistance Program, a federal aid program that provides food-buying assistance to people on low incomes. In 2002, that algorithm inadvertently banned multiple Somali markets from accepting benefits.


Reinforcing Anti-Immigrant Rhetoric

Less than a month before that Harvard computer scientist awkwardly defended his work on an automated gang crime system, President Donald Trump delivered his first State of the Union address.

Those choices are built on assumptions and prejudices about people, intimately weaving them into processes and results that reinforce biases and, worse, make them seem natural or given.

They might not actively endorse the current administration’s policies personally, but they’ve still made those ideas feel, as Galtung put it, “not wrong.”


How We Can Fix This

Where should we start repairing these systems and the culture that produces them?

Neither distributional nor representative forms of harm can survive without a cultural backdrop that enables them.

The sentiment is a good one — we should certainly work to make the world better for lots of people and not just a few. But this kind of call to action risks hollowness if it doesn’t focus on those people being hurt by discriminatory systems. As author Mandy Henk noted, our subjects aren’t some amorphous “they.” Instead, our discussions need to be grounded in the political and cultural contexts that make these people vulnerable or marginalized in the first place.
