Is AI preventing bias in recruitment — or creating it?

Artificial-intelligence tools now touch every stage of the recruitment process. While these tools save employers time and money, they may also amplify bias in hiring.

Yasmin Alameddine
Bias In AI
8 min read · Feb 19, 2020


Games for a candidate to complete to test their skills, powered by Pymetrics | Photo from Pymetrics.com

Lu Chen, 22, was a month from completing her sophomore year at Boston University when she applied to Unilever for a research and development internship. After Chen sent in her application, she was prompted by Pymetrics, a recruitment company that uses artificial intelligence, to take an online test.

At first glance, it looked like a bunch of games.

Chen spent 30 minutes clicking circles, popping balloons, and placing blocks on other blocks.

She then saw her results, which suggested she pursue “operations, retail banking, private equity, venture capital.” Chen, who had spent years in nutritional science courses and volunteering for Boston University’s pre-dental society, was perplexed. “I don’t even know what that means,” she said.

Lu Chen, now at University of Pennsylvania’s Dental Medicine Program | Photo courtesy of Lu Chen

Before Chen could look up these career paths, she received an email from Unilever. The company had rejected her and barred her from playing any more Pymetrics games, which stopped her from applying to any research- or science-related positions at Unilever.

“Unilever turned me away without even looking at my past experience,” said Chen.

Chen is now a candidate for her doctorate in dental medicine at the University of Pennsylvania. Her voice becomes hoarse with pride: “I stuck with science and research, what I wanted.”

***

Saniya More, 22, was one month from graduating from Syracuse University when she applied to internship programs at two news networks. More spent three days in her dorm room recording video responses to generic interview questions, powered by HireVue, an artificial intelligence-enabled recruitment company.

Prompts like “Tell us about your strengths and weaknesses” and “What are your long-term career plans?” flashed on More’s screen.

More said it was challenging to connect with a computer rather than with an employer in person. “Finding a personal connection, or being humorous, can make you connect with the interviewer,” More said. “It’s really hard to replicate the human touch.”

Saniya More in New York City | Photo courtesy of Saniya More.

More could not glean information about the networks’ culture or company structure through the process. “I was interviewing them as well,” More said.

In the final interview with one of the news networks, it was clear the recruiter hadn’t watched her HireVue video.

More was offered internships at both companies, but ended up selecting one over the other “because they sent me an offer first.”

***

Zoe Chevalier, 21, was two months from graduating from Williams College when she applied for a summer internship at NBC. She donned her best business-casual outfit and sat in a conference room to complete NBC’s video questions. This process was powered by ConveyIQ, which is owned by the artificial-intelligence recruitment company Entelo.

The questions — like “Why do you want to work here?” and “Tell me about your previous job?” — were generic and not tailored to the internship, NBC, or her own experience, she recalled.

“You can redo your video answers once, so I redid them each time,” Chevalier said, laughing as she recalled how nerve-wracking it was. “It was hard to see yourself on screen. You automatically feel like you didn’t do a good job.”

What a candidate sees when interviewing for a company that uses HireVue | Photo from HireVue.com

After she finished, Chevalier said, she felt helpless about the next steps: “It felt like you were putting a note in a bottle and sending it out to sea.”

“I never heard anything back from NBC,” she says. “Who knows, maybe I am still in the running?”

Chevalier graduates from Columbia University’s Graduate School of Journalism this May.

***

Chen, More and Chevalier are just three of the thousands of candidates who have gone through artificial intelligence-enabled recruitment processes. AI is becoming prevalent in every part of the process: identifying candidates, communicating with them, and recording and analyzing their interviews, resumes and applications. AI-recruitment companies promise employers they can reduce the costs and subjectivity of traditional recruiting. But there are signs that AI may be exacerbating the very bias it promises to reduce.

Recruiting is already rife with risks of discrimination. Traditional methods may “result in unconscious bias against women, minorities and older workers,” according to a 2019 Harvard Business Review report. Legacy techniques, such as face-to-face interviews and sifting through resumes, can be “poor predictors of a candidate’s performance and heavily affected by a number of biases about a candidate’s ethnicity, name, and gender,” explains Michal Kosinski, assistant professor of organizational behavior at Stanford Graduate School of Business.

But employers these days often deal with a flood of resumes for a single position. LinkedIn and other sourcing platforms average 250 applicants per role. Since that volume often can’t be handled manually, recruiters may review only 10 to 20 percent of the applicant pool, according to a 2019 Harvard Business Review report. That’s where artificial intelligence becomes critical.

Interest in using artificial intelligence to mitigate bias is growing, according to a recent Cornell University study. Most employers predict a large part of their recruitment process will be automated within 10 years.

“We have been grappling with employment discrimination for decades,” said Carol Miaskoff, legal counsel of the Equal Employment Opportunity Commission, in an interview. “There always is a new ‘brave new world’ or ‘magic bullet’ approach trying to improve this.”

AI-recruitment companies tend to use a mix of assessment methods, including short-form questions, video interviews, and gameplay.

Short-form assessments, like those used by Entelo, prompt a candidate to answer questions that test personality, situational judgment or the ability to recognize patterns. Those prompts include questions like, “Tell me about a time you changed your mind about something” and “What’s an example of something you wish you did differently?”

One of the games that tests candidates’ skills, powered by Pymetrics | Photo from Pymetrics.com

In video interviews, candidates record answers, which can be re-recorded a number of times depending on the employer’s preference. Questions are displayed in text on the screen.

Applicants’ responses are analyzed for keywords, phrases, intonation, and body or facial movements.
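None of these vendors publishes its scoring models, but a minimal sketch can show the general shape of keyword-based analysis. Everything below (the traits, the keyword lists, the sample answer) is invented for illustration; real systems also layer on audio and video features.

```python
# A hypothetical sketch of keyword-based response scoring.
# No vendor publishes its model; these traits and keywords are invented.
from collections import Counter

TRAIT_KEYWORDS = {
    "leadership": {"led", "organized", "initiative"},
    "teamwork": {"collaborated", "team", "together"},
}

def score_response(transcript: str) -> dict:
    """Count how often each trait's keywords appear in a transcribed answer."""
    words = Counter(transcript.lower().split())
    return {
        trait: sum(words[kw] for kw in keywords)
        for trait, keywords in TRAIT_KEYWORDS.items()
    }

print(score_response("I led a team project and we collaborated weekly"))
# {'leadership': 1, 'teamwork': 2}
```

Even this toy version hints at a problem Hutson describes below: a candidate who expresses the same experience in different words, or in a second language’s phrasing, simply scores lower.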

Automation can save time for employers and applicants. When Unilever implemented HireVue and Pymetrics in 2017, the time it took to hire a candidate dropped from four months to four weeks. And the matching worked better: the rate at which applicants accepted offers rose 19 percent, according to a Business Insider report.

In 2017, applications to Unilever doubled in the first 90 days, from 15,000 to 30,000 applicants. This allowed Unilever to hire its “most diverse class to date,” with an increase in non-white applicants.

A candidate’s socioeconomic status or geographical location can inhibit access to education, which can limit career opportunities. AI-recruitment tools, by testing for skills rather than background, can lower these barriers, explains Kosinski.

“We want to discover people’s hidden skills and help match them to opportunities,” says Guy Halfteck, the founder of KnackApp, a game-based AI-recruiting company. “The most exciting thing is that it’s not about pedigree or who you know.”

Not everyone is so confident of the benefits of AI.

“We live in an era of techno-chauvinism,” says Ivana Bartoletti, co-founder of Women Leading in AI, a think tank for female leaders in artificial intelligence. “We automatically think more tech is better.”

Amazon recently scrapped a machine-learning recruiting engine after realizing that it judged female applicants more harshly than males, according to a Reuters report.

Even when attempting to handle such sensitive data points as gender or nationality, AI-recruitment companies can fall short.

Cassie Rosengren, co-founder of the boutique recruiting firm Digital Knack, recalls an AI-recruitment company using preset keywords like “African-American” and “female” to “get a diverse set of people.” She says highlighting keywords wouldn’t be a sufficient long-term solution to source and hire diverse candidates.

Other AI-recruitment companies, like Good&Co and Teamscope, build an “ideal candidate” model around one or two current employees. But extrapolating from such a small sample and applying it to thousands of candidates can be discriminatory, according to the Cornell University study.
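To see why a sample of two is so fragile, consider a rough sketch of that approach: an “ideal candidate” vector averaged from two employees and used to rank every applicant by similarity. The trait values here are invented; vendors’ actual features and models are proprietary.

```python
# Hypothetical sketch: an "ideal candidate" profile built from two employees.
# Trait values (e.g., normalized game scores) are invented for illustration.
import math

def cosine_similarity(a: list, b: list) -> float:
    """Cosine similarity between two equal-length trait vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Trait vectors for the two current employees the model is built on.
employees = [[0.9, 0.2, 0.7], [0.8, 0.3, 0.6]]

# The "ideal candidate" is just their average: a sample size of two.
ideal = [sum(col) / len(col) for col in zip(*employees)]

# Thousands of applicants are then ranked against that tiny sample;
# anyone whose traits differ from those two people scores poorly.
applicants = {"A": [0.85, 0.25, 0.65], "B": [0.30, 0.90, 0.40]}
for name, traits in applicants.items():
    print(name, round(cosine_similarity(ideal, traits), 3))
```

If those two employees happen to share a background or communication style, the profile quietly encodes it, and every applicant who differs is penalized regardless of ability.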

Pre-built models, alternatively, rest on general characteristics, like “grit” and “work ethic,” and do not take into account an employer’s unique recruiting challenges or necessities.

“How we speak, emote, express things are all culturally informed,” says Jevan Hutson, the student lead at the Technology Law & Public Policy Clinic at the University of Washington School of Law. If a candidate speaks English as a second language, the AI tool may mark them as not an ideal candidate because they do not align with the model’s characteristics.

And AI companies can be opaque.

“Secret scoring of individuals threatens rights that we have protected,” says Christine Bannan, consumer protection counsel at the Electronic Privacy Information Center, a non-profit research center that focuses on privacy issues. She and her colleagues filed a complaint with the Federal Trade Commission last fall, alleging HireVue uses discriminatory practices. “It can take away real opportunities from people, without a way to know the basis to the decision or appeal it,” she said.

An employer’s view when analyzing candidates for a role, powered by Entelo | Entelo.com

Hutson notes that many AI tools do not just reject a candidate from a single role; some, like Pymetrics and KnackApp, shut a candidate out of multiple job opportunities at once. In doing so, AI companies risk replicating and amplifying bias. “You have a tool that automates employment discrimination on a large scale,” he said.

Rosengren said that some AI-recruitment companies have been flagged by LinkedIn as “not a preferred partner” for reselling or misusing LinkedIn’s data, or for using it to discriminate against candidates. “But then they pop back up again,” she notes.

Ultimately, it’s up to companies to interrogate the algorithms that are being used to screen candidates.

“People always say these companies use ‘black-box’ algorithms and there is no way to get a record of what happened to your result,” said Miaskoff, “but we should be able to.”
