Decoding gender bias in the job market

Thiemo Bubel
the-stepstone-group-tech-blog

--

What is gender bias and why is it important for the job market?

Research has found that women and men react differently to the language used in job postings. This seems to be related to the sense of belonging a candidate feels when reading a listing. Women tend to describe themselves with more communal words, while men favor skill-, agency- and leadership-oriented words. This creates a gap between how a position is described and how well male and female candidates perceive their fit to it.

Obvious examples of this are gendered job titles: fireman vs. firefighter (or, in German, Geschäftsführer vs. Geschäftsführerin). Other examples are less obvious, for instance in the phrasing of requirements. It has been found that women are more likely to apply to a position whose requirements focus on activities rather than personality traits (e.g. “you will take responsibility” vs. “you are a leader”). There are also groups of words whose usage is associated more with male or with female traits through the effect described above. Using more male-coded words can therefore discourage female candidates from applying to a job posting. Research suggests, on the other hand, that male candidates are largely indifferent to a more frequent use of female-coded words.

Overall, gender-coded language can create a bias and thereby implicitly exclude applicants of certain genders. Eliminating male bias from listings can make the application process more inclusive and potentially increase the number of applications our customers receive from female candidates.

Do we have a problem to fix?

Yes. A first analysis, based on 100 biased terms and two years’ worth of listings on StepStone, revealed that the average listing online contains five coded words. The distribution is skewed towards male-coding: ~11% of listings are strongly male-coded, while only ~6% are strongly female-coded.
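As an illustration of how such aggregates can be computed from per-listing term counts, here is a small pandas sketch; the data are toy values and the threshold for “strongly coded” is an assumption, since the post does not define it:

```python
import pandas as pd

# Toy per-listing counts of coded terms; in the real analysis these come
# from matching ~100 biased terms against two years of listings.
df = pd.DataFrame({
    "male_terms":   [4, 1, 6, 3, 0],
    "female_terms": [1, 2, 0, 2, 1],
})

print("avg coded words per listing:", (df["male_terms"] + df["female_terms"]).mean())

# "Strongly coded" is not defined in the post; a net difference of >= 3
# terms is used here purely as an assumed threshold.
STRONG = 3
share_male = (df["male_terms"] - df["female_terms"] >= STRONG).mean()
share_female = (df["female_terms"] - df["male_terms"] >= STRONG).mean()
print(f"strongly male-coded: {share_male:.0%}, strongly female-coded: {share_female:.0%}")
```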

What have we achieved at StepStone so far?

We have built an API that takes a German text as input, analyzes it for bias, and returns the biased words it found, their coding, and metrics summarizing the text as a whole.
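To make this concrete, here is a minimal sketch of what calling such an API could look like; the endpoint URL, payload fields and response shape are illustrative assumptions, not our actual interface:

```python
import requests

# Hypothetical endpoint and payload -- the real interface is internal and
# its field names are not documented in this post.
API_URL = "https://api.example.com/gender-bias-decoder/analyze"

listing_text = (
    "Wir suchen einen durchsetzungsstarken Kandidaten mit "
    "analytischen Fähigkeiten und Leidenschaft für Technik."
)

response = requests.post(API_URL, json={"text": listing_text, "language": "de"})
response.raise_for_status()
result = response.json()

# Assumed response shape: matched terms with their coding and suggested
# replacements, plus aggregate metrics for the whole text.
for match in result["matches"]:
    print(match["term"], match["coding"], match.get("suggestion"))
print("Overall metrics:", result["metrics"])
```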

For the English language, Totaljobs released a Gender Bias Decoder on a standalone page back in 2017. For more information on it, see https://www.totaljobs.com/insidejob/gender-bias-decoder/

How does the Gender Bias Decoder work?

The Gender Bias Decoder is based on a German dictionary of 50 male-coded and 50 female-coded words, created by Anna Mathes from the Linguistic Services team on the basis of the English-focused research mentioned at the beginning of this post. Examples of male-coded words include “unabhängig” (independent), “analytisch” (analytical) and “Leidenschaft” (passion).

Based on the most frequent usages of male-coded terms we identified biased phrases, e.g. “Ihre Leidenschaft für” (your passion for), “unabhängige Arbeitsweise” (independent way of working) or “analytische Fähigkeiten” (analytical skills). The linguists then took these up to find grammatically matching alternative phrasings, such as “Ihre Begeisterung für” (your enthusiasm for).

These two components make up our “bias dictionary”, which the Gender Bias Decoder uses to scan the input text and generate its analysis output. Even with this small dictionary we can already suggest at least one improvement for ~63% of all listings.
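Conceptually, the decoding step boils down to a dictionary lookup over the input text. The following is a minimal sketch of that idea; the female-coded stems are assumed examples (the post only names male-coded ones), and the matching and scoring are plausible simplifications rather than the production implementation:

```python
import re

# Illustrative excerpts only -- the production dictionary holds 50 male-coded
# and 50 female-coded German words plus phrase-level alternatives.
MALE_CODED_STEMS = {"unabhängig", "analytisch", "durchsetzungsstark", "leidenschaft"}
FEMALE_CODED_STEMS = {"engagiert", "teamfähig", "unterstützend"}  # assumed examples
PHRASE_ALTERNATIVES = {
    # one real example from the post; the full list comes from the linguists
    "ihre leidenschaft für": "Ihre Begeisterung für",
}

def decode(text: str) -> dict:
    """Find coded words (by stem prefix, to catch inflected forms) and
    suggest alternative phrasings for known biased phrases."""
    lowered = text.lower()
    tokens = re.findall(r"\w+", lowered)
    male_hits = [t for t in tokens if any(t.startswith(s) for s in MALE_CODED_STEMS)]
    female_hits = [t for t in tokens if any(t.startswith(s) for s in FEMALE_CODED_STEMS)]
    suggestions = {p: alt for p, alt in PHRASE_ALTERNATIVES.items() if p in lowered}
    return {
        "male": male_hits,
        "female": female_hits,
        "suggestions": suggestions,
        # a simple balance metric: female-coded minus male-coded occurrences
        "balance": len(female_hits) - len(male_hits),
    }

print(decode("Ihre Leidenschaft für Technik und analytische Fähigkeiten"))
```

Matching on word stems rather than exact tokens catches inflected German forms, e.g. “analytische” for “analytisch”.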

What are our results?

We have conducted in-depth analyses as well as an internal test on gender-biased phrases in German listings and applications. Both suggest that we can increase the number of female applicants by eliminating male-coding from listing texts.

Our core finding is that the German platform suffers from bias stemming both from the listings’ content and from our candidate base. Together, these amplify the problem that recruiters do not receive a balanced set of applications from male and female candidates. Reducing bias in listings could motivate more female candidates to apply.

This is what we found:

  • Overall, there are more male-biased listings on the platform
  • Female users apply more frequently to listings with less male bias
  • Our internal A/B test supports these results, showing that decoding bias can potentially increase the share of female applications a listing receives

If you’d like to know more details, just keep reading!

Analyzing historical listings and candidate application behavior

To understand the impact of gender bias in the past, we looked at historical data from two perspectives: gender bias in listings, and the effect of candidate gender on application behavior.

Listings and bias

Overall, we have more male-biased listings on the platform; however, the disciplines are not equally biased.

Some rather interesting examples of biased terms were:

  • Überdurchschnittlich (above-average): used in 17% of Law listings vs. 5% of all other listings
  • Durchsetzungsstark (assertive): mostly used in Construction, Management & Production

Candidates and biased listings

Upon analyzing internal data, we found that only about 37% of our users are female while 63% are male, in line with previous internal research. Side note: we identified gender from explicit profile entries and inferred it for a further share of users from their first names. Altogether, gender information was available for 40% of logged-in users, equating to around 750,000 users with known gender between January 2019 and October 2020.
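As an illustration of the name-based fallback, here is a minimal sketch; the lookup table is a tiny stand-in for a comprehensive name-to-gender dataset:

```python
from typing import Optional

# Stand-in lookup table; a real inference step would use a large
# name-to-gender dataset.
NAME_GENDER = {"anna": "female", "julia": "female", "thomas": "male"}

def infer_gender(explicit: Optional[str], first_name: str) -> str:
    """Prefer an explicit profile entry; otherwise fall back to the name lookup."""
    if explicit in ("male", "female"):
        return explicit
    return NAME_GENDER.get(first_name.strip().lower(), "unknown")

print(infer_gender(None, "Anna"))   # female (inferred from first name)
print(infer_gender("male", "Kim"))  # male (explicit entry wins)
print(infer_gender(None, "Alex"))   # unknown -> excluded from the analysis
```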

Although there is a certain bias in our listing base, we found no inherent bias in the types of jobs men and women look at: in general, women are just as interested in positions that show more male-coding. After viewing a listing, however, female candidates apply less often when it carries a strong male bias. This is where the Gender Bias Decoder can help recruiters avoid losing candidates along their journey on our site.

A/B test with German listings

We worked together with the local HR team to identify a total of eight listings from different disciplines (e.g. Inhouse Consultant, System Administrator, Project Manager, CS Consultant, Sales Support) to be rewritten. These listings were decoded with the Bias Decoder API, which suggested male-coded words to replace. Knowing the broader context of the full listings, we also performed more replacements than the API suggested.

A few German examples are the replacements mentioned above, such as substituting “Ihre Begeisterung für” for the male-coded “Ihre Leidenschaft für”.

Overall, we improved the listings from an average bias balance score of -2 to +2. This equates to removing around four male-coded terms per listing, leaving no male-coded words in any of the rewritten listings. The two versions of each listing were online at the same time and equally visible to users for about eight weeks.
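A note on the score: the numbers above are consistent with a bias balance defined as the count of female-coded terms minus the count of male-coded terms. That definition is an assumption, since the post does not spell it out, but it reproduces the arithmetic:

```python
def bias_balance(male_count: int, female_count: int) -> int:
    """Assumed definition: positive = female-leaning, negative = male-leaning."""
    return female_count - male_count

# A listing with 4 male-coded and 2 female-coded terms scores -2 ...
print(bias_balance(male_count=4, female_count=2))  # -2
# ... and removing all 4 male-coded terms lifts it to +2.
print(bias_balance(male_count=0, female_count=2))  # +2
```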

Read more about the technologies we use or take an inside look at our organisation & processes.
Interested in working at StepStone? Check out our careers page.
