Designing the recruiting industry’s first AI-powered technology

Patrick Chan
5 min read · Feb 1, 2019


Photo by James Pond on Unsplash

Recruiters spend 8 hours per week screening candidates — reviewing resumes, portfolios, and LinkedIn profiles. They measure what they see against the job requirements and use the knowledge they have accrued over the years to decide whether the candidate is a good match for that job.

However, recruiters are human and may overlook awesome candidates when application volume is high. Mundane tasks like screening decrease efficiency and ultimately take away from why recruiters got into the business of recruiting in the first place — to build authentic relationships with candidates.

SmartRecruiters saw this problem and took a bet on AI. We call it SmartAssistant.

Match Score

AI technology sits behind an intelligence layer whose inner workings are opaque to most users. For this reason, certain visual artifacts are needed to translate the data science into something meaningful and relatable. For SmartAssistant, this was the Match Score — a consolidation of smaller sub-scores into a single number from 0 to 100 that indicates how closely a candidate matches a job.
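As a rough sketch of what "a consolidation of smaller sub-scores" can look like, here is a weighted-mean aggregation. The sub-score names and weights are illustrative assumptions, not SmartAssistant's actual model:

```python
# A minimal sketch of a composite match score. The sub-score names and
# weights here are hypothetical, not SmartAssistant's real criteria.

def match_score(sub_scores: dict[str, float], weights: dict[str, float]) -> int:
    """Combine 0-1 sub-scores into a single 0-100 score via a weighted mean."""
    total_weight = sum(weights.values())
    weighted = sum(sub_scores[name] * w for name, w in weights.items())
    return round(100 * weighted / total_weight)

weights = {"skills": 0.5, "experience": 0.3, "education": 0.2}
candidate = {"skills": 0.8, "experience": 0.6, "education": 0.9}
print(match_score(candidate, weights))  # 76
```

Collapsing several dimensions into one number is what makes the score scannable, but as the insights below show, users still want to understand what sits underneath it.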

Insight #1 — A score of 0 to 100 was ultimately chosen to visually represent our data because customers wanted more granularity.

SmartAssistant’s Match Score

The PM and I were uncertain whether this was the best way to represent the data, so I tested one variation at a time to avoid biasing participants. Feedback on the visual representation turned out to be inconsequential compared with feedback on how the score was computed. Interestingly enough, users didn't care how it looked; they cared whether the data was accurate and unbiased.

Different visual representations that were explored and tested

Since the launch of the Match Score, some customers have asked why one candidate has a score of 59 while another, similar candidate has a score of 60. These are valid questions, but the reason customers ask them is even more interesting.

Insight #2 — Customers want to know even the minor differences in the Match Score not because it makes or breaks that candidate’s chances of getting an interview, but rather it makes or breaks the user’s trust that the product is accurate.

Moreover, a score of 80 may be enough for one company to move a candidate to the next step in the hiring process, while the same candidate may be rejected by another company. When I dug deeper into this pattern, the feedback made clear that there is no golden number in hiring.

Insight #3 — SmartAssistant should be used as a guiding beacon, instead of absolute truth.

Turning Roadblocks into Trust

Unclean Data

Data is a core building block of learning algorithms: the more high-quality data you feed a model, the more accurate its results become. To succeed with AI, clean data was a must.

Three filter options. All synonymous for humans, very different values for robots.

Imagine you have two resumes, and both applicants went to the same school, "UC San Diego". However, one abbreviated it as "UCSD" and the other spelled it out as "University of California, San Diego". How is the algorithm to know these are actually the same school? This came up many times and was generating inaccurate information.

Insight #4 — For us, this was solved by creating classification dictionaries, made up of data sets that map all the different variations to a single key entry. This of course took some time, but it was very necessary to ensure the integrity of our data and trust of our customers.
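A classification dictionary of this kind can be sketched as a simple alias map. The entries and function below are illustrative examples, not our production data:

```python
# Illustrative classification dictionary: every known variant maps to one
# canonical entry. These entries are examples, not the real data sets.

SCHOOL_ALIASES = {
    "ucsd": "UC San Diego",
    "uc san diego": "UC San Diego",
    "university of california, san diego": "UC San Diego",
}

def normalize_school(raw: str) -> str:
    """Return the canonical school name, or the input unchanged if unknown."""
    return SCHOOL_ALIASES.get(raw.strip().lower(), raw.strip())

print(normalize_school("UCSD"))                                # UC San Diego
print(normalize_school("University of California, San Diego")) # UC San Diego
```

Once every variant resolves to a single key, the algorithm can treat "UCSD" and "University of California, San Diego" as the same school instead of two unrelated ones.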

Clean data lets us surface relevant information so recruiters can make better hiring decisions. An enriched candidate profile gives recruiters work-history insights without spending time scouring the internet for that same information. Real-time recommendation functionality suggests moving a candidate to another job that may be a better fit. SmartAssistant takes tasks that were time-consuming and inefficient, and presents them to you in a way that is simple and useful.

Human Bias

Every algorithm has a human component in it, and the recruiters generating or viewing the data can exhibit unconscious biases. Design decisions made by our data scientists can also affect how algorithms are implemented and the decisions they produce. Recruiters may then rely too heavily on the output of biased algorithms, resulting in discrimination against candidates.

Let’s say Candidate A has a score of 90 and Candidate B has a score of 70, but both candidates have the same number of years of experience, the same skills, and went to the same school — essentially identical profiles. Why did one get a higher score than the other? This is a common question from our customers too. The overly simple answer is that the algorithm puts different weights on different criteria, so two profiles that appear the same can still receive different scores.

Breakdown of data from some algorithm criteria
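To make that point concrete, here is a toy illustration of how a small gap on a heavily weighted criterion can separate two otherwise similar profiles. The criteria names, weights, and numbers are hypothetical, not SmartAssistant's actual scoring:

```python
# Hypothetical criterion weights; names, numbers, and weighting are
# illustrative assumptions, not SmartAssistant's real model.
WEIGHTS = {"skills": 0.6, "experience": 0.25, "education": 0.15}

def score(profile: dict[str, float]) -> int:
    """Weighted sum of 0-1 criterion values, scaled to 0-100."""
    return round(100 * sum(profile[k] * w for k, w in WEIGHTS.items()))

# Two profiles that look nearly identical on paper...
candidate_a = {"skills": 0.95, "experience": 0.8, "education": 0.8}
candidate_b = {"skills": 0.70, "experience": 0.8, "education": 0.8}

# ...but the gap sits on the most heavily weighted criterion.
print(score(candidate_a), score(candidate_b))  # 89 74
```

A modest difference in one criterion, multiplied by a large weight, moves the total far more than the surface similarity of the profiles would suggest.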

We minimize bias by introducing the following checks and balances:

  • Training on diverse data
  • Cleaning and auditing training data
  • Avoiding the black-box effect
  • Keeping scoring logic non-customizable by users
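As one concrete way an audit of training data might look, a team can compare selection rates across groups. The four-fifths heuristic below is a common rule of thumb in hiring analytics, and the records and group labels are fabricated for illustration; this is a sketch, not how SmartAssistant's audits are actually implemented:

```python
# Illustrative audit: compare "recommended" rates across two groups in
# training data. The four-fifths rule flags a ratio below 0.8.
# Records and group labels here are fabricated examples.

def selection_rate(records, group):
    members = [r for r in records if r["group"] == group]
    return sum(r["recommended"] for r in members) / len(members)

def passes_four_fifths(records, group_a, group_b) -> bool:
    ra = selection_rate(records, group_a)
    rb = selection_rate(records, group_b)
    return min(ra, rb) / max(ra, rb) >= 0.8

data = [
    {"group": "A", "recommended": True},
    {"group": "A", "recommended": True},
    {"group": "A", "recommended": False},
    {"group": "B", "recommended": True},
    {"group": "B", "recommended": False},
    {"group": "B", "recommended": False},
]
print(passes_four_fifths(data, "A", "B"))  # False — group B lags group A
```

Checks like this catch skew in the data before the model ever learns from it.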

Results that Speak for Themselves

For one enterprise company, the hiring team struggled to fill open roles on time, with less than a quarter of all jobs meeting their targeted deadline. Below are the before-and-after results from using SmartAssistant.

  • Time to hire: 52 days → 40 days
  • Time to start: 70 days → 65 days
  • Avg time in “New” stage: 24 days → 14 days
  • Avg time in “In-Review” stage: 32 days → 20 days

After implementing SmartAssistant, they reduced their time to hire by 12 days, and increased their roles filled on time by 20%.

The hiring team also saw dramatic time savings with SmartAssistant applicant screening. Recruiters decreased their overall screening time by 40% and cut their time in the “New” stage almost in half.

These results demonstrate that, after only six months, SmartAssistant significantly boosted recruiter productivity and hiring velocity across the organization’s HR department.

Looking forward

AI is not a product or a tool; it is a capability. AI solutions like this one, which enable hiring teams to quickly execute a range of recruiting functions, especially sourcing and screening candidates for open roles, are now vital for businesses competing in an increasingly data-driven market. I learned a lot designing this feature, and there were many uncertainties, but one conclusion is clear from the results: the future of recruiting will be highly automated.
