Frida Orupabo, Untitled (2018), collage with paper pins mounted on aluminum 106x80cm. Photography: Carl Henrik Tillberg.

Death By Bias. How Algorithms Systemize Discriminatory Practices.

Empowering Data Literacy within the Black community

Álise Otilia Ramírez (they/she)
Published in The Startup · 6 min read · Sep 9, 2020


Google advertisements and Netflix recommendations have brought widespread awareness of algorithms and how they learn from our behaviours. You may have noticed eerily relevant advertisements pop up in your feed after googling an item (or even just talking about it). As users of data-collecting applications, our identifying information and behaviours (i.e., data) serve as the input to algorithms — sets of instructions — that perform tasks like generating personalized recommendations or targeted advertisements.

Algorithms are silently automating previously manual decision-making processes, allowing tasks to be scaled and optimized. However, when algorithms are built upon biased data — the byproduct of discriminatory decision-making processes — they inherit that bias. In benign applications like content recommendations, the impact of a biased algorithm is trivial: perhaps Fuller House gets recommended when you actually wanted to watch Beyoncé: The Formation World Tour for the tenth time. In more pervasive applications, however, the impact can be fatal. This blog post aims to underscore the importance of data literacy in the Black community and highlight how unregulated, biased algorithms may amplify…
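To make the inheritance concrete, here is a minimal sketch using an entirely hypothetical hiring dataset (the groups, records, and the `predict` helper are illustrative, not from any real system). A naive model "trained" on historically discriminatory decisions ends up predicting outcomes from group membership alone, reproducing the original bias at scale:

```python
# Hedged sketch with hypothetical data: a model trained on biased
# historical hiring decisions inherits and automates that bias.
from collections import defaultdict

# Hypothetical historical records: (group, qualified, hired).
# Group B candidates were rejected regardless of qualification.
history = [
    ("A", True, True), ("A", True, True), ("A", False, True),
    ("B", True, False), ("B", True, False), ("B", False, False),
]

# "Training": learn each group's historical hire rate.
outcomes = defaultdict(list)
for group, _qualified, hired in history:
    outcomes[group].append(hired)
model = {g: sum(h) / len(h) for g, h in outcomes.items()}

def predict(group):
    """Predict 'hire' from group membership alone — the pattern
    the biased data rewarded, with qualifications ignored."""
    return model[group] >= 0.5

# Two equally qualified candidates now get different outcomes,
# automatically, for every future application.
print(predict("A"), predict("B"))
```

Nothing in the code is malicious; the discrimination lives entirely in the training data, which is precisely why biased decision-making processes cannot be "fixed" simply by automating them.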


Álise Otilia Ramírez (they/she)

Experienced data professional, blogger & activist providing advice for transitioning into data careers.