I think you just discovered the definition of privilege

This is a reply to Yvonne Baur’s “Is machine learning sexist?” on TechCrunch (they only accept Facebook comments, meh).

Sexism and racism are outcomes of a system where there is an imbalance of power. They happen because the biases of those who wield more power get ingrained into the everyday aspects of everything we do.

That is a vicious cycle: it maintains that power by creating more privileges over time and, consequently, allows bias to shape everyday things ever more effectively.

Machine learning itself cannot *be* anything; it is not a being, after all. It is just a tool, and as such it will reproduce whatever biases society already has.

In fact, reproducing bias is exactly how machine learning works: it reproduces the bias we have in calling one thing a “cat” and another thing a “dog”. The idea that we can eliminate bias from machine learning is preposterous.
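To make that mechanism concrete, here is a minimal sketch (Python with scikit-learn; the features, numbers, and labels are all invented for illustration, not taken from the original post). The classifier does nothing but learn to reproduce the labels humans gave it.

```python
# A toy illustration: a classifier has no notion of what a "cat" or a
# "dog" is; it only learns to reproduce the labels humans gave it.
# The features, numbers, and labels below are invented for the example.
from sklearn.neighbors import KNeighborsClassifier

# Each row: [weight_kg, ear_floppiness_score] -- made-up features.
X = [
    [4.0, 0.1], [3.5, 0.2], [4.5, 0.1],     # images humans labelled "cat"
    [20.0, 0.9], [25.0, 0.8], [18.0, 0.9],  # images humans labelled "dog"
]
y = ["cat", "cat", "cat", "dog", "dog", "dog"]

model = KNeighborsClassifier(n_neighbors=3).fit(X, y)

# The model faithfully reproduces the human labelling convention --
# and it would reproduce a skewed or prejudiced convention just as faithfully.
print(model.predict([[4.2, 0.15], [22.0, 0.85]]))  # -> ['cat' 'dog']
```

The model never questions the labels; if the humans who labelled the data labelled it with a prejudice, the model learns and repeats that prejudice.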

It’s not enough to fight for more diverse data to be fed into machine learning systems. We have to fight for more diversity, period. The only effective way to fight sexism and racism in machine learning tools is to fight sexism and racism everywhere.