AI and gender: The ‘default male’ and the unspoken gender binary
DataKind UK’s ethics book club discussed AI and gender in September. This blog touches on some of the reflections from that discussion.
By Emma Deraze and Stef Garasto, DataKind UK Ethics Committee
In Invisible Women, Caroline Criado-Perez describes a ‘gender data gap’ that is both the result and cause of ‘a world built by and for men’. The examples of processes and products that are designed with the ‘default man’ in mind — phones that are too small for female hands, seat-belts designed for male bodies — aren’t just unfair. Criado-Perez shows that women’s lives are endangered because of such “oversights”.
Yet in many situations, there is no legal requirement to take anything other than the ‘default male’ into account — road safety tests, for example, are not required to use crash test dummies shaped for female as well as male bodies. That legislation should exist — but even if it did, it’s unlikely to be sufficient. Without a culture that strives for inclusion, targets would likely be circumvented or given lip service. (And why isn’t anyone following the money? Even if inclusivity required slightly higher manufacturing costs, wouldn’t access to a larger market of grateful customers win out in the end?)
Looking specifically at the gender data gap in AI, the conversation often makes its way back to square one: the leaky STEM pipeline. Female inclusion in STEM remains a series of small hurdles rather than one big one: from the lack of women in STEM roles (including post-doctoral positions, where R&D is often focused), to the differential acceptance rate of code written by women on GitHub (female-written code was accepted more often — but only when gender was not known), to the fact that inclusive recruitment needs to be followed up by inclusive employment policies in the years after.
Discussions of the inclusion of women — whether in test groups, review panels, academia, etc. — often frame women as a minority interest group rather than half the population. Discussions need to shift away from the norm of treating women as a small group that requires tweaks. Truly inclusive practices require a big shift: understanding that the default is male, not gender neutral or blind. (And if this “small” 4-billion-strong group can be hand-waved away so easily for so long, what hope do actual minorities have of ever being fairly represented?)
Not your problem?
But this is the reality of being female in traditionally male fields, groups or roles; and of course it has two dark sides, not just one. While women are routinely not considered in tech, many men have difficulty being accepted (and perhaps seeing themselves) in more stereotypically female roles or behaviours, even when it can also be a matter of life and death. An absence of female crash test dummies is fatal for women — and a cultural aversion to talking about mental health is particularly fatal for men. (Change is coming, but perhaps a little slowly. Prince Harry’s discussion of mental health has been important for young men, and not something that would have been common to his father’s generation.)
Past the binary
If Invisible Women does a lot to make (cis-)women less invisible, it does little, or nothing, for transgender (or trans) people — those whose gender is not the same as the one they were assigned at birth. Indeed, the concepts of sex, gender and gender expression appear to be conflated, and gender is presented as a binary concept — excepting a very brief mention of gender-neutral bathrooms.
Exclusion is no less harmful to trans people, as Os Keyes argues in another of our book club materials, The misgendering machines: Trans/HCI Implications of Automatic Gender Recognition. Trans experiences are not always captured by the traditional view of gender as binary, immutable and physiological. This erasure leads to “widespread explicit discrimination” and the knock-on impact of greater vulnerability to inequalities and violence.
Often, this conception of gender is taken for granted, rather than being explicitly stated — suggesting that trans people do not register at all in the minds of many researchers and authors. This in itself creates another gender data gap, as those that exist outside of the binary are forgotten.
Spot the gender
Poorly designed or implemented tech makes this situation worse. For example, automatic gender recognition (AI that assigns gender, often non-consensually) tends to ignore non-binary conceptions of gender entirely, and assumes that gender is something that can be externally assigned. These premises mean that this tech is, by design, not inclusive of trans people.
Moving past the binary makes things more complex — and humans and machines are alike in their distaste for complexity. But there are some potential, if partial, solutions: check — and explicitly discuss — our assumptions when we design and implement new tech; use a model of gender that includes trans and gender non-conforming people (perhaps using continuous variables for gender-based characteristics); and honestly assess whether we absolutely need to use gender at all. This adds to the complexity of the tech — and in doing so ensures that it better reflects life.
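As a purely illustrative sketch of those suggestions (all the names, fields and thresholds below are hypothetical, not drawn from any real system): a data model might treat gender as optional, self-reported free text rather than an inferred binary, record the continuous, task-relevant measurement directly instead of a gendered proxy, and include a simple check that forces designers to notice when a pipeline still depends on a gender field at all.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PersonRecord:
    """Hypothetical record: gender is self-reported and optional, never inferred."""
    user_id: str
    # Free text ('non-binary', 'agender', 'woman', ...), supplied by the person.
    gender_self_reported: Optional[str] = None
    # A continuous, task-relevant measurement (e.g. for phone sizing),
    # replacing a gendered proxy like 'designed for male hands'.
    grip_size_mm: Optional[float] = None

def pipeline_uses_gender(feature_names: list[str]) -> bool:
    """Flag pipelines that still rely on a gender field, so the team must
    honestly assess whether they absolutely need it."""
    return any("gender" in name.lower() for name in feature_names)
```

So a design review might run `pipeline_uses_gender(["age", "grip_size_mm"])` and, seeing `False`, confirm the product works from the measurement that actually matters rather than from an assigned gender.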
The DataKind book club
This blog reflects some of the discussions at DataKind UK’s ethics book club session on AI and gender. The conversation highlighted that empathy comes from looking at the ways in which we share exclusion, and also the ways in which we share privilege. We hope these discussions will enable everyone to be just a little bit more compassionate. To discuss various ethical considerations related to data science, come along to our next book club.
Trans is an umbrella term and includes (but is not limited to) many different terms such as transgender, non-binary and agender — read more here.