Why Do Machine Learning Algorithms Exhibit Bias?

In the “fastest route” example, this [bias] could occur if, for instance, the algorithmic system does not update bus or train schedules regularly. Even if the system works perfectly in other respects, the resulting directions could again discourage use of public transportation and disadvantage those who have no viable alternatives, such as many lower-income commuters and residents. (White House Report p. 8)

The individual records that a company maintains about a person might have serious mistakes, the records of the entire protected class of which this person is a member might also have similar mistakes at a higher rate than other groups, and the entire set of records may fail to reflect members of protected classes in accurate proportion to others. (Barocas & Selbst p. 684)

In particular, systematic differences in smartphone ownership will very likely result in the underreporting of road problems in the poorer communities where protected groups disproportionately congregate. If the city were to rely on this data to determine where it should direct its resources, it would only further underserve these communities. (Barocas & Selbst p. 685)
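The reporting-bias mechanism described above can be illustrated with a minimal simulation. This sketch uses hypothetical numbers (the ownership rates and problem counts are assumptions, not figures from the source): two neighborhoods have the same true rate of road problems, but different smartphone ownership rates, so their problems are reported at different rates.

```python
import random

random.seed(0)

# Hypothetical scenario: both neighborhoods have the SAME number of true
# road problems, but residents report them only if they own a smartphone.
TRUE_PROBLEMS_PER_NEIGHBORHOOD = 1000
ownership_rate = {"affluent": 0.9, "lower_income": 0.4}  # assumed values

# Each true problem is reported with probability equal to ownership rate.
reported = {
    name: sum(random.random() < rate
              for _ in range(TRUE_PROBLEMS_PER_NEIGHBORHOOD))
    for name, rate in ownership_rate.items()
}

# If the city allocates resources in proportion to *reported* problems,
# the equally-afflicted lower-income neighborhood gets a smaller share.
total = sum(reported.values())
share = {name: count / total for name, count in reported.items()}
print(reported)
print(share)
```

Even though the underlying need is identical, the data make the lower-income neighborhood look as if it has roughly half as many problems, so a "data-driven" allocation would systematically underserve it.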

In one study, Professor Latanya Sweeney found that Google queries for black-sounding names were more likely to return contextual (i.e., keyword-triggered) advertisements for arrest records than queries for white-sounding names. (Barocas & Selbst pp. 682-683)
