Photo of the author circa 2013, taken by a coworker-friend outside our office.

Algorithm EQ
Algorithms, like instruments, must be carefully tuned. How?

Ryan LaBarre
The Startup
6 min read · Jul 2, 2020

My company book club is currently reading an important book: Algorithms of Oppression. It has sparked a strong desire, in me and in many of my coworkers, to take action. I encourage you to read the book as well, to learn more about the core problems inherent to private algorithms. Ideally, algorithmically driven decisions would represent all people equitably… but that is often not the case today.

Algorithms of Oppression: How Search Engines Reinforce Racism.

Just as our world evolves, so will this blog: it is my personal, evolving account of a shared journey toward reducing algorithmic oppression. For now, it is a small first step: recognizing the problem, and recognizing that every person who can read this message has easy ways to help, right now.

This first post lays out initial plans for counteracting algorithmic oppression, a pervasive problem in current technology and one especially prevalent in search engines. Even when equality sits at their core, evolving algorithms quickly become complex and can unfairly oppress certain groups or individuals. This can be prevented.

Perspective is important to keep; for example, seeing digital trust and personal trust as separate yet related issues. Open source, repeatable digital trust…
