Fairness-Aware Job Matching in Online Recruitment Tools

Fernando Mourao · Published in SEEK blog · Aug 17, 2022 · 9 min read

In this blog post, Fernando Mourao, the Responsible AI Leader in Artificial Intelligence & Platform Services (AIPS) at SEEK, Melbourne, writes about the current challenges and future directions for computational modelling of fairness in AI-based job matching.

[Image: AI robot thinking, with formulas on a blackboard. Credit: mikemacmarketing, CC BY 2.0, via Wikimedia Commons]

A core task for online recruitment tools is job matching: automatically matching the most qualified job seekers to their most relevant roles and creating placement opportunities [1]. With the recent advances and growing popularity of these tools, society, academia, and industry have recognised their profound impact on people's lives. Companies, in turn, have realised that the holy grail is ensuring opportunities for everyone: services are genuinely essential to society when they benefit everyone. Fairness, the quality or state of being fair or producing equitable treatment, has since emerged as a primary requirement for recruitment tools [2].

Fairness in the hiring process is a challenging, preeminent, and abiding goal. The challenge starts by defining fairness itself properly. What is a fair hiring process? Although the general nature and importance of fairness are widely understood, the precise definition and what constitutes fair or unfair outcomes can only be defined in a given context. The most appropriate fairness definition depends on the use case, and it is often a matter of cultural context, legal requirements and ethical standards [3].

Grounded in this reality, society, academia, and industry have been pursuing different approaches to the automated job matching task in search of models that are both effective and fair [4,5,6]. By contrasting these approaches, this post aims to highlight the major advances in the computational modelling of fairness in the job matching task.

We identified four main approaches commonly exploited, as depicted below. Additionally, inspired by the need to continually evolve our understanding and development of fair strategies, SEEK advocates efforts around applied research of a fifth promising approach, called Aptitude-based.

[Chart: Computational modelling of fairness in job matching scenarios]
Computational modelling of fairness in the main job matching approaches. Columns describe the main approaches by which a job match can be automatically computed — Degree, Experience, Merit, Skill and Aptitude. The columns are ordered from least (leftmost) to most (rightmost) fair, considering the historical biases in recruitment domains. The first row describes the rationale behind each aspect. The second row lists the main input features for an AI model to determine the match. And the last row lists some fairness concerns an AI model may suffer from. Aptitude (fifth column) is a novel and promising approach we’re proposing to recognise job seekers’ known and demonstrable competencies which should be reweighed by the social context of each job seeker and combined with indicators of motivation and interests.

For those not interested in a detailed discussion about these approaches, we’ve summarised here our perspective on this complex and ever-changing topic. At SEEK, we believe that:

  1. These approaches indicate shifts in fairness awareness in recruitment over time. The community recognises that unfairness and discrimination in online recruitment tools are historically opaque, emergent and unintentional. Without active mitigation measures, they will arise in predictive hiring tools by default. Each approach represents progressive efforts toward delivering trustworthy AI solutions aligned with different understandings of fairness.
  2. The debate about fair job matching should be grounded in explicit corporate norms and values defined by executive leadership teams. It improves transparency, accountability and clarity about decisions and potential implications for business, users and society.
  3. The Skill-based approach represents a remarkable move toward fair hiring. By focusing on the practical requirements of a role, we can mitigate the impact of many social biases prominent in the other approaches while avoiding stereotyping qualified job seekers.
  4. While the Skill-based approach brings solid advances for a fair hiring process, there is room for improvement. The ability to demonstrate practical skills depends essentially on the previous opportunities that job seekers have had, which is well known to be determined by biases around education and experience. As a leading company in this domain, one of our responsibilities is to promote new kinds of innovation that foster ethical values. In this sense, Aptitude-based job matching might represent a promising direction that we are particularly interested in investigating further. We aim to design AI-based methods capable of better modelling the interaction between job seekers' aptitude for a role and their motivation, interests and professional background, weighted by the social context of each job seeker.
  5. A fair hiring process is context-dependent and might require a combination of different approaches. Rather than imposing a single direction on all its services, a company should consider which approaches best fit each service, use case and fairness notion. Further, fairness-aware hiring involves a delicate trade-off between effectiveness, fairness, and assertiveness. For instance, if (hypothetically) men pursue awards more aggressively than women, then men may be deemed more qualified than women by systems that recognise awards. In this case, being award-blind would have the benefit of making predictive outcomes more equitable between men and women (at a group level). On the other hand, the resulting systems will likely have a reduced ability to assess whether a job seeker is a good fit for the role, impairing their practical utility (the sketch after this list illustrates the trade-off).
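
A minimal sketch of the award-blind trade-off described in point 5, using entirely hypothetical candidates, scores and weights:

```python
# Hypothetical illustration of the award-blind trade-off: compare the
# group-level selection-rate gap for a score that rewards an "awards"
# feature against an award-blind score. All numbers are made up.

def selection_rate(scores, groups, label, threshold=0.5):
    members = [s for s, g in zip(scores, groups) if g == label]
    return sum(s >= threshold for s in members) / len(members)

# Hypothetical candidates: (skill_signal, has_award, group)
candidates = [
    (0.7, 1, "men"), (0.5, 1, "men"), (0.4, 0, "men"),
    (0.7, 0, "women"), (0.5, 0, "women"), (0.4, 0, "women"),
]

award_aware = [0.8 * skill + 0.2 * award for skill, award, _ in candidates]
award_blind = [skill for skill, _, _ in candidates]
groups = [g for _, _, g in candidates]

for name, scores in [("award-aware", award_aware), ("award-blind", award_blind)]:
    gap = abs(selection_rate(scores, groups, "men")
              - selection_rate(scores, groups, "women"))
    print(f"{name}: men vs women selection-rate gap = {gap:.2f}")
```

In this toy setup, the award-blind score closes the group-level selection-rate gap, but it also discards a signal the award-aware score used to rank candidates, which is exactly the loss of assertiveness mentioned above.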

In summary, the practical application of fairness in job matching is fluid and constantly evolving. The real challenge is not mathematical correctness but how to properly define this context-dependent human value and make algorithmic systems support it!

The challenge for the aptitude-based approach is designing predictive models that work better than humans in determining non-subjective signals of qualification, weighted by social contexts and devoid of unwanted biases.

Are you interested in understanding how we came to these conclusions? The Detailed Discussion section below presents the main concepts, arguments and facts identified in our extensive research on this topic.

Detailed Discussion

The skills-based approach represents a remarkable move toward fair hiring. By focusing on the practical requirements of a role, we can mitigate the impact of many social biases prominent in the other approaches while avoiding stereotyping qualified job seekers.

Degree-based hiring

It is a hiring process focused on evaluating job seekers’ educational backgrounds and degrees.

Often the reputation of the educational institutions where job seekers get their diplomas strongly influences this process. Common input features of AI-based systems anchored on this approach are related to course names, degrees, certificates, and education institution names.

The degree-based approach particularly hurts low-income adults and racial or ethnic minorities. There is a growing recognition that a college degree is neither necessary to get a good job nor a guarantee that the graduate will be the best match for a role. As employers and job seekers both reach this conclusion, it becomes increasingly necessary to consider qualifications beyond merely having a diploma.

Experience-based hiring

It is a hiring process grounded in the belief that experienced workers will perform better and need less training and time to get “up to speed” [10].

In this case, job matching features are usually derived from the experience length, previous roles, employers and responsibilities reported by job seekers. The importance of work experience is perceived to be so great that even some entry-level jobs and internships call for it.

In this case, younger professionals tend to be penalised. It is also noteworthy that experience-based hiring has proven to be biased by gender and parental status: female job seekers with career gaps due to parental leave, for instance, are penalised by automated experience-based matching systems. Finally, it is well known that experience is strongly determined by previous opportunities, which are inherently connected to many social and educational biases.

Merit-based hiring

It is defined as the process of selecting and appointing employees based on demonstrated talent, effort, abilities and achievements.

The definition of merit itself is usually ambiguous and difficult to measure, which opens up space for misuse in recruitment. Cultural contexts might also change the practical interpretation of this concept.

For instance, in many countries, companies use the equivalent of 'merit-based' hiring to indicate that they place a higher value on people willing to work after hours, which is usually biased towards men and/or people who are not caregivers. In practice, merit might represent skills, behaviours, work performance indicators, and career achievements (e.g., awards, badges, certificates, and other measurements of effort).

Regarding fairness, the merit-based approach tends to value the experience gained through previous employment or other activities, which might reinforce biases towards people with access to opportunities. Indeed, the belief that job seekers can achieve upward career mobility through their own merits, regardless of their social position, race, or gender, is deemed a myth by many researchers [9].

Historically, the aspirational quest for merit has reinforced the disproportionate advantages of social elites over marginalised groups in labour markets. For instance, in politics and government, merit-based hiring has always been a legal requirement [8]. Yet many more males than females have been "successful" on merit alone, because merit is linked to past opportunities, of which males have had many more than females.

Skill-based hiring

It is a hiring process focused on identifying relevant competencies that job seekers demonstrate possessing.

Having a skill implies some level of competence in that skill. Although "skills" can be interpreted more broadly as personal attitudes, abilities, and technical skills, the concept is usually more specific and measurable than merit [7]. Cultural and contextual factors also influence this definition. For example, in Australia, skills are mostly seen as characteristics primarily necessary for the performance of a job and are largely developed or emphasised through education.

In practice, AI systems designed for skill-based hiring consider skills as a set of well-known personal attributes (a.k.a. soft skills) and technical and education-instilled characteristics (a.k.a. hard skills) frequently related to each role.
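
To make this concrete, here is a minimal sketch of how a skill-based match could be scored. The split into hard and soft skills follows the description above, while the skill names and the simple overlap metric are our own illustrative assumptions:

```python
# Illustrative skill-based matching: the fraction of a role's required hard
# and soft skills that a job seeker demonstrates. Skill names are hypothetical.

def skill_match(required: set, demonstrated: set) -> float:
    """Fraction of required skills the job seeker demonstrates."""
    if not required:
        return 0.0
    return len(required & demonstrated) / len(required)

role_hard_skills = {"sql", "python", "data-modelling"}
role_soft_skills = {"communication", "stakeholder-management"}
seeker_skills = {"python", "sql", "communication", "excel"}

hard_score = skill_match(role_hard_skills, seeker_skills)  # 2 of 3 hard skills
soft_score = skill_match(role_soft_skills, seeker_skills)  # 1 of 2 soft skills
print(f"hard: {hard_score:.2f}, soft: {soft_score:.2f}")
```

Real systems must first infer which skills a job seeker actually demonstrates from noisy signals such as resumes and role histories, which is where the fairness concerns discussed next arise.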

In terms of fairness, this is another approach strongly biased towards people with access to professional or educational opportunities. The premise here is that all job seekers have been equally able to develop and demonstrate practical skills, regardless of their social context. Finally, it is worth mentioning that skills should be carefully selected to avoid stereotyping job seekers. For instance, it is well known that the set of soft skills commonly associated with leadership roles is male-biased.

Aptitude-based hiring

We define it as the process of selecting job seekers based on estimated capability and commitment to perform or learn a task.

To the best of our knowledge, this is the first time aptitude-based hiring has been defined for job matching. It is based on the premise that job seekers’ known and demonstrable competencies should be reweighed by the social context of each job seeker and combined with indicators of motivation and interests.

The aptitude to fulfil a role is directly related to at least two main factors. The first is the practical capability to perform or quickly learn the required tasks. To measure this capability, computational models are usually grounded in skills, which are biased towards people with access to opportunities.

Due to social inequalities, not all job seekers have had the same opportunities to demonstrate their actual competencies. There is an information issue here: we are less certain about the aptitude of some job seekers. For this reason, we still need to design novel predictive models able to expand and reweigh the set of skills associated with each job seeker to reflect the actual limitations they have faced.

The second main factor is the commitment to perform the role. Personal motivation and interests are strong commitment signals we could explore in this case. However, current job matching models lack quantitative signals for a job seeker's motivation and interest in a given role. Usually, this assessment is left to the humans in control of the remaining steps of the hiring process, who often introduce cognitive or historical biases into the judgement. Hence, the challenge for the aptitude-based approach is designing predictive models that work better than humans in determining non-subjective signals of qualification, weighted by social contexts and devoid of unwanted biases.
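
As a way of grounding the discussion, the toy sketch below combines the two factors: demonstrated-skill evidence reweighed by a social-context signal, plus motivation and interest indicators. Every signal, weight and functional form here is a hypothetical placeholder, not a description of any SEEK system:

```python
from dataclasses import dataclass

@dataclass
class SeekerProfile:
    skill_match: float        # demonstrated-skill overlap with the role, in [0, 1]
    opportunity_index: float  # hypothetical proxy for prior access to opportunities, in [0, 1]
    motivation: float         # hypothetical motivation signal, in [0, 1]
    interest: float           # hypothetical interest signal, in [0, 1]

def aptitude_score(p: SeekerProfile, alpha: float = 0.6) -> float:
    # Reweigh demonstrated skills upward for seekers with fewer prior
    # opportunities, reflecting the greater uncertainty about their true
    # competence described above. The 0.5 boost is an arbitrary choice.
    adjusted_capability = min(p.skill_match * (1 + 0.5 * (1 - p.opportunity_index)), 1.0)
    # Commitment blends motivation and interest equally (another assumption).
    commitment = 0.5 * p.motivation + 0.5 * p.interest
    return alpha * adjusted_capability + (1 - alpha) * commitment

print(aptitude_score(SeekerProfile(skill_match=0.6, opportunity_index=0.3,
                                   motivation=0.8, interest=0.7)))
```

The open research questions are precisely the parts this sketch hand-waves: how to estimate an opportunity index and motivation or interest signals reliably, and how to do so without introducing new unwanted biases.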

We believe it is a promising research direction for recruitment domains. If we limit the hiring process to considering only demonstrable skills and achievements as indicators of professional qualification, we will end up surfacing the most privileged job seekers, who, for certain roles, are typically white, middle-aged males.

References

  1. Lin, Yiou, et al. “Machine learned resume-job matching solution.” arXiv preprint arXiv:1607.07657 (2016).
  2. Schumann, Candice, et al. “We need fairness and explainability in algorithmic hiring.” International Conference on Autonomous Agents and Multi-Agent Systems (AAMAS). 2020.
  3. McCalman, L., et al. "Assessing AI Fairness in Finance." Computer 55.1 (2022): 94–97.
  4. Elbassuoni, Shady, Sihem Amer-Yahia, and Ahmad Ghizzawi. “Fairness of scoring in online job marketplaces.” ACM Transactions on Data Science 1.4 (2020): 1–30.
  5. Holstein, Kenneth, et al. “Improving fairness in machine learning systems: What do industry practitioners need?.” Proceedings of the 2019 CHI conference on human factors in computing systems. 2019.
  6. Mujtaba, Dena F., and Nihar R. Mahapatra. “Ethical considerations in AI-based recruitment.” 2019 IEEE International Symposium on Technology and Society (ISTAS). IEEE, 2019.
  7. Tom Vander Ark. “The Rise of Skills-Based Hiring And What It Means For Education.” Forbes. 2021. Available at: https://www.forbes.com/sites/tomvanderark/2021/06/29/the-rise-of-skills-based-hiring-and-what-it-means-for-education/ (Accessed: 8 August 2022)
  8. "Merit system" (2022) Wikipedia. Available at: https://en.wikipedia.org/wiki/Merit_system (Accessed: 8 August 2022)
  9. Schoenberger, C. R. (2022). When Discrimination Meets Meritocracy. Stanford Social Innovation Review, 20(3), 66. https://doi.org/10.48558/B3J3-2J59
  10. Birkelund, Gunn Elisabeth, et al. “Experience, stereotypes and discrimination. Employers’ reflections on their hiring behavior.” European Societies 22.4 (2020): 503–524.
