This open-source tool shines a light on gender bias

by Daniela Saderi | A spotlight on Reading for Gender Bias, a 2018 Global Sprint project

Mozilla Open Leaders
Read, Write, Participate
4 min read · May 4, 2018


Mollie Marr, @MollieMarr, is an MD/PhD student in the Department of Behavioral Neuroscience at Oregon Health & Science University in Portland, OR.

I have known Mollie for a while now, and I am always impressed with her energy and desire to bring change where she sees change is needed. She is a true force for good! When Mollie came to me asking for advice about how to develop a “gender bias reader” for recommendation letters, I immediately thought, what a great project to bring to the Mozilla Open Leadership Program!

I was not surprised to learn that she was selected to join the current round of Mozilla Open Leaders with her project “Reading for Gender Bias”.

Gender bias (like any other kind of bias imposed by our societal structures and personal experiences) is real, and it affects people’s lives from the very earliest stages of their education, career paths, and job opportunities. An elementary school teacher’s comment that boys are better suited to play soccer, or that girls are less naturally inclined toward mathematics, can shape a kid’s experience in ways that affect how they make choices throughout their lives. These biases are so rooted in our culture that even when we know they are there, they are hard to see and shake off.

Mollie saw an opportunity to help create a tool that can highlight these sneaky biases in letters of recommendation, arguably one of the most influential pieces of a candidate’s evaluation for college admissions, promotions, or any job selection.

I interviewed Mollie to learn more about Reading for Gender Bias and how you can contribute to making this project even more awesome at the Global Sprint 2018.

What is “Reading for Gender Bias”?

“Reading for Gender Bias” is a web-based text analysis tool that scans and reveals language bias associated with evaluations and letters of recommendation written for trainees and applicants. It provides a summary of feedback to authors on how they can remove bias from their letters. A friend described it as “auto-correct for gender bias.”

Why did you start “Reading for Gender Bias”?

I heard a lecture at Oregon Health & Science University (OHSU) by Dr. Esther Choo on “Understanding Gender Pay Gap.” She described how letters of recommendation and evaluations written for women differ in key ways from those written for men, and how those differences negatively impact women in everything from grading to hiring and promotion. She challenged us to think of possible solutions. I went home that night fired up and started searching the literature. I found studies across disciplines and professions that consistently showed the differences Dr. Choo described. I realized that the differences were not only well studied and established, but could be used as “rules” in logic statements. A web-based text analysis tool seemed like the next step to tackle this bias. I found some great existing bias calculators and am hoping to extend these tools by incorporating rules or signals for as many of the existing differences as possible.
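The idea of encoding documented differences as “rules” can be sketched as simple word-list lookups over the letter text. Here is a minimal, hypothetical illustration in Python; the category names and word lists below are placeholders loosely inspired by the research literature (e.g. “grindstone” effort words vs. “standout” excellence words), not the project’s actual rules:

```python
import re

# Hypothetical rule sets for illustration only; the real project's
# rules are tracked as separate GitHub issues and may differ.
RULES = {
    "grindstone": {"hardworking", "conscientious", "dependable", "diligent"},
    "standout": {"exceptional", "outstanding", "brilliant", "superb"},
}

def scan_letter(text):
    """Return, for each rule category, the matching words found in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    summary = {category: [] for category in RULES}
    for word in words:
        for category, terms in RULES.items():
            if word in terms:
                summary[category].append(word)
    return summary

letter = "She is hardworking and dependable, a truly conscientious student."
print(scan_letter(letter))
# {'grindstone': ['hardworking', 'dependable', 'conscientious'], 'standout': []}
```

A real implementation would also weigh phrases in context and turn these counts into feedback for the author, but the core pattern — match text against research-derived word lists and summarize — is this simple, which is what makes the project approachable for new contributors.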

How do you hope to impact the broader conversation on implicit gender bias with this tool?

What amazed me about the research on gender bias is that everyone is guilty of it — it is truly implicit and unconscious. Women are as likely to write weaker letters for other women as their male colleagues are. The goal of this project is to create an easy-to-use tool that provides direct feedback to authors of evaluations and letters of recommendation. My hope is that by providing feedback and, in turn, making authors aware of potential implicit bias, they will strengthen their written evaluations and letters, ultimately making the process fairer for everyone. I hope that this tool will spark conversations about all types of bias and how we can intervene to prevent our own bias from influencing our decision-making.

What challenges have you faced working on this project?

Creating the prototype was tough. I wanted something that would be easily accessible by users, but also flexible and adaptable so that new rules and features could be added. I am not a programmer, so the learning curve was steep!

What kind of skills do I need to help you?

Anyone can help! If I can do this, you can do this!! I need help coding the “rules” that will be used to search user text for bias and return feedback (I’ve divided the rules into separate issues), designing the website to accept user input and return feedback in summary form, and documentation.

How can others join your project at #mozsprint 2018?

I have a README and roadmap to introduce everyone to the project. I’ve also created issues on GitHub related to different types of bias in written evaluations, and anyone can work on one of them. There are lots of ways to join; see the Contributors’ guidelines to get started.

What meme or gif best represents your project?

Join us wherever you are, May 10–11, at Mozilla’s Global Sprint to work on many amazing open projects! Join a diverse network of scientists, educators, artists, engineers, and others, in person and online, to hack and build projects for a healthy Internet. Register today.

This post by Daniela Saderi is licensed under a Creative Commons Attribution 4.0 International License.


Mozilla Open Leaders
Read, Write, Participate

A cohort of Open Leaders fueling the #internethealth movement through mentorship & training on working open. Work Open, Lead Open #WOLO mzl.la/openleaders