DRAFT! “Sex, pleasure, and diversity-friendly software”: excerpts from the article the ACM wouldn’t publish

DRAFT! FEEDBACK WELCOME! PLEASE DO NOT FORWARD!

Sex, pleasure, and diversity-friendly software was originally written as an invited contribution to the Human to Human issue of XRDS: Crossroads, the Association for Computing Machinery’s student magazine. After a series of presentations on diversity-friendly software, it seemed like an exciting opportunity to broaden awareness among budding computer scientists of important topics that are generally overlooked both in university courses and in the industry.

Alas, things didn’t work out that way.

Overriding the objections of the student editors, and despite agreeing that the quality of the work was high and the ideas were interesting, the ACM refused to publish the article.

What’s at Issue: Sex, Stigma, and Politics in ACM Publishing (co-written with Alex Ahmed, Judeth Oden Choi, Teresa Almeida, and Kelly Ireland) puts this episode in context and explores some of the underlying institutional and sociopolitical problems it highlights. This post is a companion piece, with some extended (and lightly-edited) excerpts from the original article that are especially interesting reading in light of how events played out, along with some reflections and a list of references.

Introduction

Most software today reinforces existing societal power dynamics and works best for people who are similar to the teams who created it. Given the demographics and biases of the software industry, today’s dynamics tend to leave women, gender-diverse people, people of color, disabled people, and many others out of the equation.

People and communities create software, which in turn empowers people and communities

Diversity-friendly software, by contrast, is intentionally designed for a diverse user base. Many techniques for diversity-friendly software, such as accessibility, gender HCI (human-computer interaction), and flexible and optional self-identification, are backed by solid research and practical experience. For the most part, though, these techniques are not yet broadly practiced in the industry.

Software relating to sex and pleasure education is an interesting microcosm of the broader industry. Sex has long been a major driver for online innovation; streaming video, for example, was first introduced by a Dutch porn company in 1994. Today, sexual wellness is a huge market, and porn companies are looking to move into the area. Unsurprisingly, their initial efforts have been critiqued as “almost like a white-washing scam to justify the kind of anti-female pleasure, misogynist, distorted sexuality that often eroticizes humiliation, that’s devoid of intimacy, and at best mis-represents female pleasure”. From a software perspective, porn platforms focus on commodification and the needs of cis males.

Sites like Scarleteen, OMGYes, O.school, Make Love Not Porn, and Thurst, by contrast, are led by women, trans, and non-binary people, and take a much broader and more inclusive view of their audience. For example, Thurst, the first dating app for queer people of all genders, prioritizes safety and community accountability above normative dating culture. OMGYes provides “knowledge for women and partners” and takes a science-based approach. O.school’s initial alpha test covered topics including “Negotiating Consent While Living with a Mental Illness”, “Healing from Religious Shame”, “Why Pleasure Matters”, and “Sexy Safe Sex”. Software to support intimate spaces for discussing these intensely personal topics requires some very different priorities.

Reflections

In retrospect, it’s not surprising that even the brief observation about the role sex has played in driving online innovation, and the remark that “porn platforms focus on commodification and the needs of cis males”, would cause skittishness at the ACM. Still, without discussing topics like this there’s no way to do justice to the issue’s vision of examining “different ways in which technology impacts intimacy, gender expression and identity, sexual health and education, and interpersonal relationships.”

And at the risk of getting too meta: the ACM’s decision not to publish the article is a great example of the point made in the first paragraph.

Given the demographics and biases of the software industry, today’s dynamics tend to leave women, gender-diverse people, people of color, disabled people, and many others out of the equation.

Software embeds biases

Since software is designed, written, and tested by people, it’s scarcely surprising that the industry’s current and historical diversity challenges have embedded themselves in the software itself. Only rarely do developers intentionally insert biases into software; it usually happens unconsciously.

One excellent example of this pattern is web accessibility: making websites and applications usable for people with a diverse range of hearing, movement, sight, and cognitive ability. The original HTML specifications didn’t take accessibility into account, and even though standards have since evolved, accessibility is still treated as an afterthought. It requires additional expertise and effort to create web pages that support screen readers or mouseless navigation. These skills are not generally taught in undergraduate courses or coding schools, and many companies do not invest the resources to make their software accessible.
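
To make the gap concrete, here’s a minimal sketch of the kind of automated check an accessibility-minded team might run: it uses Python’s standard-library HTML parser to flag images that have no alt text, one of the most common failures screen-reader users run into. The markup in the example is hypothetical.

```python
from html.parser import HTMLParser


class AltTextAuditor(HTMLParser):
    """Flag <img> tags with no alt attribute, which screen readers can't describe."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_dict = dict(attrs)
            if "alt" not in attr_dict:
                # Record the image source (or a placeholder) for follow-up.
                self.missing_alt.append(attr_dict.get("src", "<unknown source>"))


# Hypothetical markup: one image with alt text, one without.
auditor = AltTextAuditor()
auditor.feed('<img src="team.jpg"><img src="logo.png" alt="Company logo">')
print(auditor.missing_alt)  # ['team.jpg']
```

A check like this only scratches the surface of the relevant standards (WCAG covers far more than alt text), but it illustrates how little extra tooling it takes to start catching the most basic gaps.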

Algorithms are another important source of bias. Here’s how MIT grad student Joy Buolamwini, founder of the Algorithmic Justice League, explains why facial recognition software tends to have a harder time recognizing Black faces:

“Computer vision uses machine-learning techniques to do facial recognition. You create a training set with examples of faces. However, if the training sets aren’t really that diverse, any face that deviates too much from the established norm will be harder to detect.”
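
One way a team can surface this kind of skew is to break recognition accuracy out by demographic group instead of reporting a single overall number. Here’s a minimal sketch of such an audit, assuming each test image comes with a ground-truth identity and a group label; the names and numbers are hypothetical.

```python
from collections import defaultdict


def accuracy_by_group(predictions, labels, groups):
    """Report recognition accuracy separately for each demographic group.

    A large gap between groups suggests the training set under-represents
    some of them -- exactly the failure mode Buolamwini describes.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for predicted, actual, group in zip(predictions, labels, groups):
        total[group] += 1
        if predicted == actual:
            correct[group] += 1
    return {group: correct[group] / total[group] for group in total}


# Hypothetical results from a small face-recognition test set.
print(accuracy_by_group(
    predictions=["alice", "bob", "unknown", "dana"],
    labels=["alice", "bob", "carol", "dana"],
    groups=["lighter-skinned", "lighter-skinned", "darker-skinned", "darker-skinned"],
))
# {'lighter-skinned': 1.0, 'darker-skinned': 0.5}
```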

Another high-profile case of algorithmic bias was reported in Julia Angwin et al.’s Pulitzer Prize-nominated Machine Bias series on ProPublica: “There’s software used across the country to predict future criminals. And it’s biased against blacks.”

Social networks provide other examples of bias. Harassment and threats of violence primarily target women and gender-diverse people, especially women and gender-diverse people of color. Twitter essentially ignored this harassment for years, and its more recent attempts to do something about it have been remarkably unsuccessful. Facebook’s moderation disproportionately penalizes activists of color.

Improving the diversity of the teams creating the software, and creating a more inclusive environment, is one approach to reducing biases in software. More diverse teams will naturally tend to consider more dimensions of diversity. If a team developing facial recognition software has Black engineers, they’re likely to notice the absence of Black faces in their data set, or to test the software on pictures of themselves and discover that it doesn’t work. Similarly, if a team developing social network software includes women of color activists who have been targeted by harassers, they’re likely to pay more attention up front to moderation features and other defenses against harassment, and to have a better understanding of the problems they’re trying to solve.

Improving diversity and inclusion in the software industry is an important priority, but by itself it is not enough. As mentioned above, the industry’s progress on this front has been extremely slow. Not only that, there are so many different dimensions to diversity that any team will have gaps; and most software today is built largely from existing components, which embed biases of their own.

A complementary approach is to look at software development techniques that focus on diversity.

Reflections

Of course, it’s not just the software that embeds biases; power structures do as well. And the biases that helped shape the ACM’s decisions in this case ripple throughout the field of computer science. The discussion of web accessibility here is a good example: it highlights a topic that is a major gap in most undergraduate curricula, is held to appallingly low standards in industry, and was almost completely ignored in XRDS: Crossroads’ 2014 issue on Diversity and Computer Science.

Conclusion

More examples of startups building diversity-friendly applications include:

  • Atipica’s talent and diversity intelligence solutions take a personalized and empathetic approach using data to guide teams through traditionally difficult conversations around diversity and inclusion
  • Blendoor’s merit-based matching is hiring technology that reduces unconscious bias
  • Nametag, a platform for building relationships, takes inspiration from offline organizing tactics that work for building relationships and trust
  • Textio’s augmented writing platform uses analytics to help teams find more qualified — and more diverse — candidates

Not so coincidentally, these companies are led by women of color, asexual activists, and others who have traditionally been underrepresented in the software industry.

Looking further to the future, imagine a new software stack designed by diverse teams working with diverse communities, with an explicit goal of countering different dimensions of systemic oppression. What would an intersectional feminist programming language look like? A queer programming language? How will software platforms, tools, protocols, and libraries evolve?

Diverse, inclusive people and communities create software that embeds diversity, and in turn empowers the diverse people and communities who created it

As software “eats the world” — and increasingly defines the power vectors and distribution of wealth in our society — it’s more important than ever that we consciously design and implement it in a way that empowers everybody.

Reflections

The list at the start of this section is an example of how this article focuses on diversity and the perspectives and contributions of women, queer and trans people, people of color, and others in a technical context. This is a relatively rare and very important complement to the representational, cultural, and experiential discussions of diversity usually found in ACM publications. Conversely, the ACM’s decision not to publish this article wound up reinforcing one of the barriers to diversity in the entrepreneurial ecosystem: lack of awareness of successful role models for startup CEOs from marginalized backgrounds.

And …

The original article also had several sections which I decided not to include here. The background section provided definitions and references for topics like intersectionality and inclusion, and was useful material in the context of a student-focused article, but there are much richer discussions of diversity in technology out there; see for example the Kapor Institute’s Leaky Tech Pipeline (summarized nicely by Jessica Guynn in USA Today) or Atlassian’s recent 2018 State of Diversity Report (and Nicole Sanchez’s One Big Takeaway).

There was also a second section on how diversity-friendly software techniques apply to sex and pleasure education software, briefly covering a robust code of conduct, pseudonymity and support for self-determination of gender pronouns, effective chat moderation, and threat modeling for harassment. There are several other posts here with deeper discussions of these topics, so rather than include the text I’ll just link to them.

  • Diversity-friendly software at SXSW, with Shireen Mitchell, is a good starting point. As well as the video, there are links for areas including setting intention, accessibility, flexible optional self-identification, threat modeling for harassment, and algorithmic bias.
  • The Supporting diversity with a new approach to software wiki pages that Tammarrian Rogers and I put together for Open Source Bridge are a more detailed but less-annotated (and much less nicely formatted) collection of links to these techniques.
  • Gender HCI, Feminist HCI, and Post-Colonial Computing summarizes research in these specific areas, and includes several videos. The GenderMag resources in particular — structured cognitive walkthroughs and personas for finding gender-linked software issues — are something that almost any software engineering team can apply.
  • Transforming Tech with Diversity-friendly software has an example of threat modeling for harassment, and looks at the open-source, decentralized, ad-free, anti-fascist social network Mastodon through a diversity-friendly software lens.

Originally published at A Change Is Coming.