G-A.I.-DAR: Can technology accurately identify your sexuality?
Many people claim to have a ‘gaydar’: an ability to accurately identify whether someone is gay. Statistics, however, show that people can tell gay from straight only 61% of the time for men and 54% of the time for women. This aligns with research suggesting that humans can determine sexuality from faces at only just better than chance (chance being a coin flip between gay or straight, so 50%).

The idea that technology could be used to sort and segregate society is not new. Technology is getting smarter and can now identify individuals almost anywhere on earth, through anything from a simple smartphone camera to satellites. But technology is also beginning to identify people in ways we could not have fathomed. This month, Stanford University released a study that was featured in The Economist under the title “Advances in AI are used to spot signs of sexuality.” The study found that an algorithm could correctly distinguish between gay and straight men 81% of the time, and between gay and straight women 74% of the time. Artificial intelligence has a knack for analyzing large amounts of data and discovering patterns, but the idea that a human’s sexual orientation can now be inferred by neural networks raises a new level of ethical concern for the LGBTQ community, as does the question of what else may be discoverable through facial recognition. The article examines the issue largely through what the results imply about how reliant sexual orientation is on biological factors. As pointed out in “5 Reasons why Surveillance is a Feminist Issue” by Nicole Shephard, a member of the Engenderings editorial collective who completed her PhD at the London School of Economics, surveillance affects society as a whole.
Shephard’s first point is that “surveillance is about social justice.” She writes that “Surveillance is not a topic reserved for tech geeks and the security industry, it is decidedly a social justice issue.” This passage relates directly both to the study and to the general direction of the class. Recognizing the danger in new technology is necessary to stop the formation of a true Big Brother state and to avoid reversing the strides made in social justice.
Before analyzing its implications, it is important to understand the study’s full purpose, its strengths and weaknesses, and the discrepancies in the method used to validate the model. For this study, the scientists scanned a large number of public dating profiles, fed an algorithm example photos together with the sexual orientation each user had indicated on the website, and then began showing it new, unlabeled photos. In her article, Shephard raises a good point: “Contemporary surveillance heavily relies on statistical categories and algorithms, resulting in effective mechanisms of social sorting,” with gaps in the data filled by assumptions. The Stanford study, however, did not rely on assumptions and statistical guesswork; it used the sexual orientation each dating-site user had reported about themselves, combined with their photos, to see whether the A.I. could find a pattern in the photos. This grounded the model in users’ actual stated identities rather than in assumptions. The study is nonetheless somewhat misleading. The A.I. was shown two photos at a time, with the guarantee that exactly one depicted a gay person. This diminishes the impressiveness of the system, since the A.I. always had a 1 in 2 chance of guessing correctly, and it allowed the A.I. to approach the problem as ‘which photo seems more likely to be of a gay person,’ perhaps even forming an internal grading system to decide. Still, the A.I. managed to connect particular facial features to sexual orientation purely from patterns it gathered over many photos. To further confirm the model’s ability despite the favorable odds, the researchers also fed it five photos per person, which increased accuracy significantly: when shown five photos of each man, it identified sexuality correctly 91% of the time, up from 81%.
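Why showing the model five photos instead of one boosts pairwise accuracy can be illustrated with a toy simulation. All numbers below are hypothetical (they are not the study’s actual scores or methodology): if each photo yields a noisy score and the classifier simply picks the profile with the higher average score, averaging over more photos cancels out the noise and lifts accuracy above the single-photo level.

```python
import random

random.seed(0)

SIGNAL = 0.5   # hypothetical gap between the two groups' mean scores
NOISE = 1.0    # hypothetical per-photo noise

def photo_score(in_target_group):
    # One noisy score for a single photo from an imagined classifier.
    return (SIGNAL if in_target_group else 0.0) + random.gauss(0.0, NOISE)

def pairwise_accuracy(n_photos, trials=20000):
    """Pairwise task: one profile from each group; guess the profile
    whose averaged score is higher. Chance level is 50%."""
    correct = 0
    for _ in range(trials):
        avg_a = sum(photo_score(True) for _ in range(n_photos)) / n_photos
        avg_b = sum(photo_score(False) for _ in range(n_photos)) / n_photos
        if avg_a > avg_b:
            correct += 1
    return correct / trials

acc1 = pairwise_accuracy(1)   # one photo per profile
acc5 = pairwise_accuracy(5)   # five photos per profile
print(f"1 photo:  {acc1:.0%}")
print(f"5 photos: {acc5:.0%}")
```

Under these made-up parameters, averaging five photos reliably beats a single photo, mirroring the direction of the study’s reported jump from 81% to 91%; the exact figures depend entirely on the assumed signal and noise levels.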
In either case, the accuracy far outdoes the human ‘gaydar.’ Viewing the same images, people could tell gay from straight 61% of the time for men and 54% of the time for women. The model was never told that certain angles or faces signal homosexuality, nor was it raised on existing stereotypes; it formed its own analysis of the data. This provides empirical, scientific evidence that there are tangible biological features connected to sexual orientation. However, it also sheds light on how easily simple data analysis could be put to dangerous uses.
The implications of this study fall into two main categories: further evidence that homosexuality is a biological trait, and the fact that dangerous surveillance tools are now within reach of the masses. To begin with the biological implications: Dr Kosinski and Mr Wang, the leaders of the study, offer a plausible explanation for their model’s performance. “As fetuses develop in the womb, they are exposed to various levels of hormones, in particular testosterone. These are known to play a role in developing facial structures, and may similarly be involved in determining sexuality.” This supports the case against the existing argument that homosexuality is a choice. In regards to surveillance, the study sheds light on how easily minorities can be accurately targeted today. While the rights of minorities, including the LGBTQ community, have improved in recent history, these groups are nowhere near free of oppression in society. As Shephard notes in section 3 of her article, on the “need for anonymity to maintain so-called safe spaces online,” genderqueer minorities “potentially risk much more than the average white person by revealing everything.” A person’s sexuality has no business purpose, and the ability to identify it will lead to stereotyped marketing and the formation of assumptions. The biggest issue, compared with traditional forms of big data analysis, is that while the usual demographics are a reasonable way to segment data that has patterns, sexual orientation is an extremely personal, unconnected attribute of a human being. A person has no particular tendencies just because they are gay, and analyzing this information serves no purpose but to tag individuals.
To sum up: the technology now exists, combined with some statistics, to identify a plethora of traits from data too complex for humans to find patterns in naturally. It may be beneficial in supporting minorities, for instance against the accusation that homosexuality is a choice; but if left unregulated, this technology could have devastating effects on the LGBTQ community’s right to privacy, and could further enable society to be segregated, stereotypes to be imposed, and people’s rights to be stripped away.