3 Tips to Reduce AI Bias in Recruiting

Divercity, Inc.
The Bridge by Divercity
4 min read · Jan 28, 2022

We've discussed some of the ways AI can perpetuate bias in your hiring process. The question is: what can you do about it? Our world is undoubtedly moving toward more automation, and relying on algorithms in hiring processes is slowly becoming the norm. While algorithms can introduce biases of their own, that doesn't mean recruiters should avoid using them altogether. After all, AI-free hiring can come with challenges of its own.

We've all heard of the infamous six seconds recruiters supposedly spend on a resume during the first round of cuts. Whether this is true of all recruiters or a popularized myth is unclear, but we do know that when faced with hundreds if not thousands of resumes, recruiters can only spend so much time on each one. That time pressure naturally allows unconscious bias to influence their decisions. Recruiters are increasingly relying on AI to help them with the initial resume screening phases; in fact, in 2018, 67% of recruiters surveyed by LinkedIn said AI was "helping them save time." With AI-enabled hiring becoming a reality, it's important to ensure that our algorithms are not further perpetuating bias.

1. Identify Your DEI Gaps Before Relying on AI

It's unsurprising that recruiters are increasingly relying on tools that save them time. That said, if your initial pool of applicants is largely homogeneous, you're in trouble. Even if AI can help you parse through resumes faster, it won't help you recruit a more diverse set of candidates on its own.

This is why it’s important to assess the gaps in your hiring process before you begin to rely on algorithms. This includes paying attention to who’s been applying to your jobs, and what the hiring rates are for various groups of candidates. In other words, if you’re relying on AI to help reduce that initial volume of applications, you need to ensure that the pool of applicants is a diverse one, and that underrepresented candidates are being given a fair shot.
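To make this concrete, here's a minimal sketch of such a funnel audit in Python, assuming you can export application records with voluntary, self-reported demographic data. The file and column names ("applications.csv", "group", "hired") are hypothetical placeholders for whatever your ATS actually exports:

```python
# Minimal funnel audit: who applies, and who gets hired, by group.
# Assumes a CSV export with one row per application, a self-reported
# "group" column, and a boolean "hired" column (hypothetical schema).
import pandas as pd

applications = pd.read_csv("applications.csv")

# Composition of the applicant pool: share of applications per group.
pool_share = applications["group"].value_counts(normalize=True)
print("Applicant pool composition:")
print(pool_share)

# Hire rate within each group: are outcomes comparable across groups?
hire_rate = applications.groupby("group")["hired"].mean()
print("Hire rate by group:")
print(hire_rate)
```

If the pool composition is skewed, or one group's hire rate lags far behind the others, that's the gap to address before layering AI on top.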

If you find that there are gaps in these areas, one thing you might want to do is ensure that your job advertisements are not dissuading underrepresented candidates from applying. There's plenty of research showing that certain language in job descriptions tends to discourage various groups of underrepresented candidates from applying. Thankfully, there are tools out there to help you analyze your job descriptions for non-inclusive language; see, for example, Gender Decoder and Textio.
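To give a sense of how such tools work under the hood, here's a toy version of the word-list approach that Gender Decoder popularized. The word lists below are short, illustrative samples, not the research-backed lists the actual tools ship with:

```python
# Toy gendered-language scan for a job description. The word lists are
# made-up samples for illustration; real tools use much larger,
# research-derived lists.
import re

MASCULINE_CODED = {"aggressive", "dominant", "ninja", "rockstar", "competitive"}
FEMININE_CODED = {"collaborative", "supportive", "nurturing", "interpersonal"}

def scan_job_description(text: str) -> dict:
    words = set(re.findall(r"[a-z]+", text.lower()))
    return {
        "masculine": sorted(words & MASCULINE_CODED),
        "feminine": sorted(words & FEMININE_CODED),
    }

print(scan_job_description(
    "We need an aggressive, competitive rockstar to join our supportive team."
))
# {'masculine': ['aggressive', 'competitive', 'rockstar'], 'feminine': ['supportive']}
```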

If you're also relying on AI to help you advertise positions to the right candidates, this review process will help you ensure that those algorithms aren't advertising your positions to a homogeneous group of candidates.

2. Continuous Testing Is Key

If you're relying on external tools in your recruiting process, it's important to research their track record when it comes to diversity. Do they profess a commitment to diversity? How do they ensure that their algorithms are free of bias? This research is worth doing before you start fully relying on an external algorithm, and if the provider doesn't have answers to these questions, that's already a bad sign. AI tools are only as good as the teams designing them: even well-intentioned algorithms tend to inherit the biases of their designers and can end up codifying them.

A professed commitment to DEI from an algorithm's designers is a good first step, but you certainly shouldn't stop there. Whether you're relying on internally developed algorithms or external ones, you must test them for bias before you begin to rely on them, as these algorithms make consequential decisions that can truly compromise the equity and inclusivity of your hiring process.
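One common starting point for that testing is the "four-fifths rule" from the EEOC's Uniform Guidelines: if any group's selection rate falls below 80% of the highest group's rate, that's a red flag worth investigating. Here's a minimal sketch of the check, using hypothetical screening numbers:

```python
# Disparate-impact check via the four-fifths rule: flag any group whose
# selection rate is below 80% of the best-performing group's rate.
def selection_rates(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """outcomes maps group -> (passed_screen, total_applicants)."""
    return {g: passed / total for g, (passed, total) in outcomes.items()}

def four_fifths_flags(outcomes: dict[str, tuple[int, int]]) -> dict[str, bool]:
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: rate / best < 0.8 for g, rate in rates.items()}

screen_results = {"group_a": (90, 300), "group_b": (40, 250)}  # hypothetical
print(four_fifths_flags(screen_results))
# group_a: 0.30, group_b: 0.16 -> 0.16 / 0.30 ≈ 0.53 < 0.8, so group_b is flagged
```

A flag here doesn't prove the algorithm is biased, but it tells you exactly where to dig in before trusting the tool with real decisions.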

If you're relying on your own algorithms, a good principle for reducing bias is to design them in an auditable manner, such that bias can be easily identified and eliminated by the people reviewing them.
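In practice, auditability can start with something as simple as logging every decision together with the inputs that produced it, so a reviewer can later ask why a given candidate was screened out. Below is a sketch of one possible log format; the field names are illustrative, not a prescribed schema:

```python
# Append-only decision log for an in-house screening model, so audits
# can replay what the model saw and what it decided.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class ScreeningDecision:
    candidate_id: str
    model_version: str
    features_used: dict   # the inputs the model actually saw
    score: float
    passed: bool
    timestamp: str

def log_decision(record: ScreeningDecision, path: str = "audit_log.jsonl") -> None:
    # JSON Lines: one decision per line, easy to scan or load later.
    with open(path, "a") as f:
        f.write(json.dumps(asdict(record)) + "\n")

log_decision(ScreeningDecision(
    candidate_id="c-123",
    model_version="resume-screen-v2",
    features_used={"years_experience": 4, "skills_matched": 7},
    score=0.62,
    passed=True,
    timestamp=datetime.now(timezone.utc).isoformat(),
))
```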

3. Revisit and Reassess Your Diversity Gaps

Once you make the leap and start relying more heavily on AI in your hiring process, it's important to consistently check back and see whether your DEI goals are being met. Is your AI-enabled hiring process encouraging more underrepresented candidates to apply? Are your rates of hiring underrepresented candidates rising or falling? These are important questions to keep asking, and it's worth scheduling regular checks to answer them. Algorithms can develop new biases as they continue to learn, so the fact that your algorithms showed no bias last quarter doesn't necessarily mean that will continue.
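One lightweight way to operationalize those recurring checks is to track a single impact ratio (the lowest group's selection rate divided by the highest group's) quarter over quarter and flag drops. The quarterly numbers below are hypothetical:

```python
# Quarter-over-quarter drift check on the selection-rate impact ratio.
QUARTERLY_RATES = {  # group -> selection rate, per quarter (hypothetical)
    "2021-Q4": {"group_a": 0.31, "group_b": 0.29},
    "2022-Q1": {"group_a": 0.32, "group_b": 0.22},
}

def impact_ratio(rates: dict[str, float]) -> float:
    return min(rates.values()) / max(rates.values())

previous, current = (impact_ratio(QUARTERLY_RATES[q]) for q in sorted(QUARTERLY_RATES))
print(f"impact ratio: {previous:.2f} -> {current:.2f}")
if current < 0.8 or current < previous - 0.05:
    print("Selection-rate gap is widening; investigate before the next cycle.")
```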

Having human oversight over an algorithm's decisions can help ensure those decisions are consistently vetted for bias. Even in the presence of automated tools, human expertise is still essential. Amazon's AI recruiting tool, which was found to be biased against women applicants, is a cautionary example of the dangers of fully trusting an algorithm's decisions.
