Recruitment biases and Ira (Part 1 of 3)

Ankit Kashyap
Published in hireIra
Oct 21, 2018 · 3 min read

These days we keep hearing about biases in artificial intelligence solutions and algorithms. A recent and notable incident was Amazon scrapping its internal recruiting tool after it turned out to be biased against women. Amazon is one of the few companies in the world using AI algorithms at such a scale, and if a technology leader like Amazon could not handle the biases, it certainly shows that AI bias is not purely a technological problem.

In one of our recent team meetings, I was asked whether Ira can help in reducing recruitment biases. Oh! I forgot to introduce Ira to you. Ira is an intelligent recruitment assistant that augments recruiters to achieve more. Using simple conversational interfaces (chatbots, Skype, etc.), Ira helps recruiters perform key hiring tasks such as engaging and screening candidates, promoting the employer brand, scheduling interviews, following up and onboarding.

As with any other AI-based solution, a significant part of the system’s active learning comes from the various interactions that are fed into its learning models (still miles to go. Wanna join hands? Shoot me a mail). Since many recruitment activities and decisions are taken under the influence of some sort of bias, the datasets used in the learning process, and the resulting models, are highly prone to bias. In fact, if reinforcement keeps nudging up these patterns, the biases can grow even uglier than in real life (as happened with Amazon’s tool, which showed strong bias against women).
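To see how this happens, here is a toy sketch (purely illustrative, not Ira’s actual pipeline): we simulate historical hiring decisions that favoured one gender, train a simple model on them, and watch the model pick up the bias as if it were signal.

```python
# Toy illustration: a model trained on biased historical hiring
# decisions learns to reproduce the bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

skill = rng.normal(size=n)            # the signal we actually want
gender = rng.integers(0, 2, size=n)   # irrelevant to job performance

# Historical "hired" labels: driven mostly by skill, but past
# recruiters systematically favoured gender == 1.
logit = 1.5 * skill + 1.0 * gender - 0.5
hired = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([skill, gender])
model = LogisticRegression().fit(X, hired)

# The learned gender weight comes out clearly non-zero: the model has
# internalised the historical bias, not just the skill signal.
print("learned weights [skill, gender]:", model.coef_[0])
```

And note that simply dropping the gender column would not fix this if other features (college, hobbies, word choice) act as proxies for it, which is exactly what tripped up Amazon’s tool.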

To overcome this bias problem and keep Ira bias-free, one could think of using only “right” datasets, which are unbiased and depict the perfect situation. But in a human-centric process like recruitment, such datasets are hard to even imagine. There are a few promising technological solutions, e.g. Textio, which helps write job descriptions free of gender bias. However, a recruiter’s cognitive tasks are still very much out of technology’s reach.

Now let’s come back to the question of Ira helping in reducing bias. At Ira, we believe that these sorts of behavioral AI problems cannot be solved by technology alone. We must embrace diversity in every major hiring process. Until we dilute the stereotypes around gender, ethnicity and the like, it is hard to escape our biases. Any organisation can start with small steps like:

  • Forming diverse interview panels
  • Being transparent to candidates about feedback and follow-ups (accountability and self-consciousness towards biases)
  • Making initial screening more objective, and ideally automatic (see the sketch after this list)
  • Letting bots evaluate resumes (even if they do a poor job at first, you can track their decisions and they can learn from the corrections)
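For the screening step, objectivity largely means writing the rubric down before anyone looks at an application. Here is a minimal, hypothetical sketch of such a screen (the criteria, weights and field names are illustrative assumptions, not Ira’s actual product): every candidate is scored against the same explicit, job-relevant criteria, and protected attributes never enter the rubric at all.

```python
# Hypothetical objective first screen: fixed, job-relevant criteria
# with weights agreed up front; no name, gender or age in the rubric.
from dataclasses import dataclass

@dataclass
class Candidate:
    years_experience: float
    skills: set[str]
    has_required_degree: bool

# Set per role, before anyone sees applications.
REQUIRED_SKILLS = {"python", "sql"}
WEIGHTS = {"experience": 0.4, "skills": 0.4, "degree": 0.2}

def screen_score(c: Candidate) -> float:
    experience = min(c.years_experience / 5.0, 1.0)  # cap credit at 5 years
    skills = len(c.skills & REQUIRED_SKILLS) / len(REQUIRED_SKILLS)
    degree = 1.0 if c.has_required_degree else 0.0
    return (WEIGHTS["experience"] * experience
            + WEIGHTS["skills"] * skills
            + WEIGHTS["degree"] * degree)

print(screen_score(Candidate(3.0, {"python", "sql", "aws"}, True)))  # 0.84
```

Because every decision is a number computed from declared criteria, the screen is auditable: you can log the scores, compare them against later hiring outcomes, and adjust the weights when they prove wrong.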

Obviously, such initiatives can’t be managed on a traditional ATS (applicant tracking system) or classic HR products. You need modern capabilities and state-of-the-art tools that can support you in transforming your current recruitment processes. Sticking with plain old recruitment tools and processes might be doing your organisation more harm than good.

Ira, with its cutting-edge technology, customizable smart workflows and candidate-engagement solutions, can solve most of your existing hiring-process problems. So, if your recruitment teams are struggling with biases, bad hires or inefficient processes, send an email to help@hireira.com for a quick product demo (it includes a customized chatbot for your organization).

See you soon in Part 2 of this series!
