Computer Algorithms: Learning to Discriminate
This week I read the article “Can an algorithm be racist? Spotting systemic oppression in the age of Google” by Luke Dormehl (https://www.digitaltrends.com/cool-tech/algorithms-of-oppression-racist/). The article discusses the work of Dr. Safiya Umoja Noble and her book “Algorithms of Oppression,” which acts in a way as a technologically focused update to Langdon Winner’s “Do Artifacts Have Politics?”, which we read for class this week. Noble argues that the algorithms that run the digital world, particularly Google’s recommendation engine, encode the world with a structure that maintains the marginalization of minorities, making them “quite literally a part of systemic racism.” Noble found this very apparent when searching on Google: when she searched terms such as “black girls” or “asian girls,” she received only pornographic results, even though her search terms included nothing close to such content. Furthermore, African-Americans have reported seeing ads about arrest records, ads which reportedly are not shown to white users making similar searches. There are even cases where users who downloaded a gay dating app that provided the locations of other gay men were then suggested a sex-offender location tracking app by Google. The mechanisms producing these search results and serving these ads are automatic algorithms, so it seems that this racial discrimination is not the work of a programmer at Google, but rather the result of an automatic algorithm learning human discriminatory culture and what Google’s users associate with different races and cultures. Finally, the article raises the question of Google’s responsibility for this algorithmic discrimination.
Noble argues that, since these technology companies have a huge impact on society and its communities, and therefore could cause great harm, they should be held responsible for any damage done, just as oil companies are held responsible for environmental damage. Noble expresses her hope that people who develop technology have, along with technical skills, a “deep education on societal issues,” as they embed their own worldviews and values into the technology they develop. Thus, the question remains of how tech companies should adjust their algorithms in response to these situations, or whether they are even qualified to make judgments about what is right and what is wrong. I think these are good questions to ask, but I also think there is no truly good answer to them. At first glance, it seems to me that the correct thing for these algorithms to do is to always provide the results of their calculations, whatever they may be, as this is what they are mathematically designed to do. But I also think that if an algorithm produces a very undesirable result, such as Google’s racist search results and ads, then the algorithm should be adjusted or its results filtered in response, although not to the extent of undermining the entire algorithm and its mathematical basis.
This article touches upon many of the same topics we discussed in class, as well as in the articles “U.S. Operating Systems at Mid-Century” by Tara McPherson and “Society, nature and the concept of technology” by Tim Ingold. All of these readings argue that the formulation of technology, like the formulation of the sciences we discussed earlier, is not some un-human process separated from the worldviews and political situations of its era. Rather, embedded within these technological advancements, whether through conscious or unconscious effort, are structures that inherit these societal attributes. The discussion of the development of UNIX in McPherson’s article in particular offers valuable background on this phenomenon. McPherson discusses how many of the fundamental principles of computer programming and organization, which are surely in extensive use in Google’s algorithms, were developed with a modular, separate/segregated design that parallels the segregation of African-Americans in America during the same period. These parallels show how, even subconsciously, people encode outside human factors into the development of technology, and work under principles influenced by these human and political factors.