“No girls allowed” in your school’s software?
Margaret Burnett, Ph.D. / Distinguished Professor of Computer Science, Oregon State University; ACM Fellow; NCWIT AA Advisory Board
Some students don’t seem to “get” computers as well as others. Unconscious biases in software design are sometimes to blame.
How does biased software affect women and girls?
Students of different genders often use software differently, but most software is made by people who identify as men. Unconsciously, men often make software that tends to be “cognitively optimized” for other men. This leaves a lot of people out.
We can think about this happening in other contexts, too. When right-handed students throw with their right hand, they tend to throw farther with less effort. If we force them to throw with their left hand, we’re disadvantaging them.
Similarly, when a student naturally problem-solves one way and the software forces them to problem-solve another way, we’re disadvantaging them. For software, the “them” being disadvantaged is often women and girls.
What kind of differences are there in how people use software?
In our Gender-Inclusiveness Magnifier (GenderMag) research at Oregon State University, we’ve identified five different cognitive styles people bring to their use of software. One of those differences in style is how people process information. For example, when trying to figure out how to accomplish a task in software, we’ve observed that men often prefer to choose the first promising option: processing information selectively with an incremental style. In contrast, in our studies women often prefer to gather information about many options before choosing: processing information comprehensively with a “burst-y” style.
Software that doesn’t support both of these styles is biased. One great example of avoiding this bias is Gidget (helpgidget.org), free software for teaching computer programming to young people. We’ll use it to show how software can support both selective and comprehensive information processing:
- Support for selective information processing: Students are free to gather a tiny bit of information about a programming command and immediately experiment with it. They can execute their program one step or line at a time, or execute the program to the end. If students make a mistake, they can then gather a little more information to figure out how they want to modify their program.
- Support for comprehensive information processing: Students can learn about programming commands before executing them. They can expand tooltips containing command documentation and pin multiple tooltips to the screen at once. This lets students learn as much as they want about the commands up front, think about how the commands fit together before they start writing, and compare one command to another. (The sketch after this list illustrates how one interface can offer both paths.)
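To make the two styles concrete, here is a minimal sketch of how a teaching environment might offer both paths in one interface: stepping and experimenting a little at a time for selective processors, and pinning several pieces of documentation side by side for comprehensive processors. This is not Gidget’s actual code; every class, method, and command name here is hypothetical.

```python
# Hypothetical sketch: one interface, two information-processing styles.
# None of these names come from Gidget; they only illustrate the idea.

class TeachingEnvironment:
    def __init__(self, program_lines, docs):
        self.program_lines = program_lines   # the student's program, one command per line
        self.docs = docs                     # command name -> documentation text
        self.next_line = 0
        self.pinned = {}                     # tooltips the student has pinned open

    # --- Selective style: act on a little information, then learn from the result ---
    def run_next_line(self):
        """Execute just one line, so the student can immediately see what it did."""
        if self.next_line < len(self.program_lines):
            line = self.program_lines[self.next_line]
            self.next_line += 1
            return f"ran: {line}"
        return "program finished"

    def run_to_end(self):
        """Execute the rest of the program in one go."""
        results = []
        while self.next_line < len(self.program_lines):
            results.append(self.run_next_line())
        return results

    # --- Comprehensive style: gather and compare information before acting ---
    def pin_tooltip(self, command):
        """Keep a command's documentation open on screen; several can be pinned at once."""
        self.pinned[command] = self.docs.get(command, "no documentation found")

    def compare_pinned(self):
        """Show all pinned documentation together so commands can be compared."""
        return dict(self.pinned)


env = TeachingEnvironment(
    ["right 2", "grab", "up 1"],
    {"right": "Move right N squares.",
     "up": "Move up N squares.",
     "grab": "Pick up the object on the current square."})

env.pin_tooltip("right")        # comprehensive: read and compare documentation first...
env.pin_tooltip("grab")
print(env.compare_pinned())
print(env.run_next_line())      # selective: try one step, see what happens, adjust
```

The point of the design is that neither path is privileged: a student can run a line the moment they are curious, or read and compare as much as they like before running anything.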
Including information processing style, here are five ways people often differ in how they use software:
- Motivations: People have different reasons for wanting to use software (e.g., for fun versus to complete a task), and this affects how they go about it.
- Computer self-efficacy: People have different levels of confidence about using software that’s new to them, and this affects what they try.
- Attitude toward risk: Some people are willing to take more risks than others in the ways they use software.
- Information processing style: People have different ways of getting the information they need when they try to solve problems with software.
- Learning by process vs. by tinkering: People have different ways of learning software they’re using for the first time.
What is being done about gender bias in software?
At OSU, my collaborators and I use the GenderMag method to find and fix these biases. To find out who is being left out by a particular software product, people who buy software (e.g., educators or school administrators) or create it (e.g., software companies) take the perspective of one or more personas (an example “Abby” persona appears in this article) and walk through the software as that persona, noting wherever someone like the persona could have a bad experience. And it works: for example, it entirely eliminated the gender gap in one recent software product (see our “Failures” graph).
The GenderMag method can remove gender biases from software (we show this in one of our recent publications). In the data shown here, it reduced task failures for women, reduced task failures for men, and completely eliminated the gender difference in failure rates.
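For readers curious what “walking through the software as a persona” looks like in practice, here is a minimal sketch of how a team might record a persona’s five facet values and the kinds of questions asked at each step. The field names, facet values, and questions below are illustrative paraphrases, not the official GenderMag personas or walkthrough forms.

```python
# Hypothetical sketch of recording a persona's five facet values for a walkthrough.
# Names and values are illustrative only; the real personas and forms come from
# the GenderMag materials themselves.

from dataclasses import dataclass

@dataclass
class Persona:
    name: str
    motivation: str              # why they use software (e.g., to get a task done vs. for fun)
    computer_self_efficacy: str  # confidence with unfamiliar software
    attitude_toward_risk: str    # willingness to try risky actions
    information_processing: str  # comprehensive vs. selective
    learning_style: str          # by process vs. by tinkering

abby_like = Persona(
    name="Abby (illustrative values only)",
    motivation="uses software to accomplish her tasks",
    computer_self_efficacy="lower than her peers'",
    attitude_toward_risk="risk-averse",
    information_processing="comprehensive",
    learning_style="process-oriented",
)

# During a walkthrough, the team steps through each action the persona would take
# and asks questions in this spirit at every step:
walkthrough_questions = [
    "Given her facet values, will {name} know what to do at this step?",
    "If she does the right thing, will she know she is making progress?",
]
for question in walkthrough_questions:
    print(question.format(name=abby_like.name))
```

Whenever the answer to one of those questions is “no,” the team has found a place where people who share that persona’s cognitive styles could be left out, and can fix it.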
What can YOU do?
Here are actions you can take, depending on your role, to help bring gender-inclusive software to K-12 students in your school or district.
When we remove gender bias from software, we are making that software more inclusive for everyone. GenderMag is a freely available resource.
Margaret Burnett is a Distinguished Professor of Computer Science at Oregon State University. She began her career in industry, where she was the first woman software developer ever hired at Procter & Gamble Ivorydale. She currently leads the team that created GenderMag, a software inspection process that uncovers gender inclusiveness issues in software from spreadsheets to programming environments. Burnett is an ACM Fellow, a member of the ACM CHI Academy, and an award-winning mentor. She also serves on the Design Advisory Council of iGiant and the Academic Alliance Advisory Board of the National Center for Women & Information Technology (NCWIT).