How Automated Tools Discriminate Against Black Language

Platforms have the power to moderate not just content, but language itself

Anna Chung
Feb 28, 2019 · 10 min read
Credit: Caspar Benson/Getty

One of my favorite aspects of social media is coming across amazing work by activists, creatives, and academics. I get especially excited to see work by fellow women of color, whose perspectives are often left out of mainstream media and activism. So naturally, when I discovered that posts by women of color were being filtered out of my feed, I was skeptical and upset, but not surprised.

This recently happened as I was using Gobo, a social media aggregator and filtering platform created by my colleagues at the MIT Center for Civic Media. Gobo was created to address how little knowledge and control people have over the way their social media feeds are filtered. It aims to give that control back by letting users adjust how their feeds are filtered along six categories: politics, seriousness, rudeness, gender, brands, and obscurity. When the user adjusts the filter for a category, posts are either filtered into or out of their feed, and Gobo tells them why.
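
To make that mechanism concrete, here is a minimal, hypothetical sketch of how an explainable per-category filter might work. It is not Gobo's actual code; the names (`Post`, `apply_filters`, the score and threshold values) are illustrative assumptions. The idea is simply that each post carries a score per category, the user's slider sets a threshold, and every keep-or-hide decision records a human-readable reason the interface can surface.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    scores: dict                       # e.g. {"rudeness": 0.8, "seriousness": 0.3}
    reasons: list = field(default_factory=list)

def apply_filters(posts, thresholds):
    """Keep posts whose category scores fall within the user's thresholds,
    and attach a reason to every filtering decision."""
    kept, hidden = [], []
    for post in posts:
        # Find every category whose score exceeds the user's chosen limit.
        failed = [
            category
            for category, limit in thresholds.items()
            if post.scores.get(category, 0.0) > limit
        ]
        if failed:
            post.reasons = [f"hidden: {c} score above your setting" for c in failed]
            hidden.append(post)
        else:
            post.reasons = ["shown: within all of your filter settings"]
            kept.append(post)
    return kept, hidden

# Example: a user who dials the "rudeness" slider down to 0.5.
kept, hidden = apply_filters(
    [Post("a", "hello!", {"rudeness": 0.2}), Post("b", "ugh.", {"rudeness": 0.9})],
    thresholds={"rudeness": 0.5},
)
```

The point of the sketch is the `reasons` field: whatever model produces the scores, the filter's decision is explained back to the user rather than applied silently.
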

I recently joined the Gobo team to understand how effective these filters are and how they could be more useful. As it stands, Gobo is less a product and more of a provocation for…

Anna Chung

Researcher & Designer at MIT Center for Civic Media / Comparative Media Studies