Your Algorithm Has a Racial Problem
Fixing systemic racism in automated systems
It’s easy to spot the racist marching with a KKK or Nazi flag, or the so-called influencer admitting on the news that she uses the N-word all the time.
That’s all awful and should be called out, but it’s also a giant distraction.
What really impacts people is the rampant racism baked into everyday systems. Sadly, those systems include computerized, automated ones.
It’s bad enough people have to confront human bias, but even our machines are wreaking racial havoc.
Computer-based programs that determine credit scores and drive facial recognition, predictive policing, risk assessment, hiring and even electronic healthcare models are embedded with pervasive inequities and biases.
Sound strange? Here are some examples of how it works.
Credit Scoring Bias
Credit scoring models typically are trained using past financial data, including payment history, credit use and loan…