Machines have no prejudices or biases
Tools — 3, Tools That Think for Us

In fact, machines can have prejudices and often do have biases. Bias is not merely a human failing; it is unavoidable for epistemological reasons: it necessarily arises from the limitations of our knowledge.

For example, consider a machine-learning program whose job is to choose the salary of a potential hire based on their estimated value to the company. In effect, the computer is asked to predict the candidate’s future, which in turn controls the candidate’s salary.

The algorithm is not told the person’s gender (so as to avoid gender bias), but it has access to virtually all other information about the candidate (past work, employment history, statistics about their family, perhaps even a history of their purchases at certain stores), as well as a massive database of other people, their histories, and their contributions to their companies.

The software observes that the candidate is unmarried, has one child, and has a history of buying tampons. It detects a significant correlation between these facts and going “on leave” (most likely maternity leave, not that it matters to the computer) in the future, which would carry a substantial opportunity cost and lower the candidate’s value to the company. Similarly, it detects that the candidate may be less available to work overtime than average and may use more sick days, along with other factors that most humans wouldn’t think of. Given these factors, it decreases the salary offered by 4% (or whatever). It does all this without ever knowing or caring that the candidate is a single mother struggling to find a reasonable work-life balance.
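The mechanism above can be sketched in a few lines. This is a toy illustration, not any real hiring system: the dataset, feature names, base salary, and the 15% “opportunity cost” penalty are all invented for the example. The point is that the model never sees a gender column, yet its offers still diverge along a gendered proxy.

```python
import random

random.seed(0)

# Hypothetical synthetic data: each record is (buys_tampons, took_leave).
# A "gender" column is deliberately absent, but buys_tampons acts as a proxy:
# it is generated to correlate strongly with being female, and leave-taking
# is generated to depend on being female.
data = []
for _ in range(1000):
    is_female = random.random() < 0.5
    buys_tampons = is_female and random.random() < 0.9
    took_leave = is_female and random.random() < 0.3  # e.g., maternity leave
    data.append((buys_tampons, took_leave))

def p_leave_given(proxy_value):
    """Empirical P(took_leave | buys_tampons == proxy_value)."""
    group = [leave for proxy, leave in data if proxy == proxy_value]
    return sum(group) / len(group)

BASE_SALARY = 100_000
PENALTY = 0.15  # assumed opportunity-cost factor, purely illustrative

def offer(proxy_value):
    """Salary offer discounted by the predicted chance of going on leave."""
    return BASE_SALARY * (1 - PENALTY * p_leave_given(proxy_value))

# Offers differ for candidates who differ only in the proxy feature,
# even though gender was never an input.
print(f"proxy=True:  {offer(True):,.0f}")
print(f"proxy=False: {offer(False):,.0f}")
```

Removing the sensitive attribute doesn’t remove the bias; it just launders it through whatever correlated features remain.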

(Google seems to have people working on the discrimination problem, though I suspect the method described in the linked paper might not help with the scenario I just described: it says “parity would not allow the ideal predictor Ŷ = Y, which can hardly be considered discriminatory”. By this line of thinking, if we could know for certain that a woman will take maternity leave, and knowing that maternity leave is often much longer than paternity leave, her leave is more costly for the company, and thus it is not “discriminatory” to offer less to a woman who will have a baby in the future than to a man.)
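The quoted objection to demographic parity can be made concrete with a toy calculation. The base rates below are invented for illustration, but the logic is general: if the true outcome Y occurs at different rates in two groups, then the ideal predictor Ŷ = Y necessarily makes positive predictions at different rates too, so enforcing parity would forbid even a perfectly accurate predictor.

```python
# Hypothetical base rates of the true outcome Y ("took leave") in two groups.
# These numbers are made up for the example.
group_a_rate = 0.30  # e.g., women
group_b_rate = 0.03  # e.g., men

# The ideal predictor Ŷ = Y predicts "leave" exactly when leave happens,
# so its positive-prediction rate in each group equals that group's base rate.
parity_gap = abs(group_a_rate - group_b_rate)

# A nonzero gap means even the perfect predictor violates demographic parity.
print(f"demographic parity gap of the ideal predictor: {parity_gap:.2f}")
```

This is why the paper proposes other criteria (such as equalized odds) instead of raw parity; the worry in the scenario above is that such criteria can still license lower offers when the underlying cost difference is real.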

Clapping shows how much you appreciated David Piepgrass’s story.