Introducing neuroscience games for fair and bias-free recruiting

Did those hours of World of Warcraft really go to waste?

Leon Wang
StartupReview
4 min read · Aug 5, 2018


Depending on who you ask, the current recruitment process is at different levels of broken. Extending an idea from a previous article, automation might be the solution.

Today I will be reviewing Pymetrics, a New York based startup that uses neuroscience games and bias-free algorithms to recommend the best candidates for a job opening.

Pymetrics’ games for behavioral evaluation. Image from Google.

The Startup

Founded in 2013 by Frida Polli and Julie Yoo, Pymetrics is a talent-matching platform that assesses candidates based on potential rather than connections, pedigree, or resume. It has leveraged its evaluation platform to cut costs and improve talent search for its client and partner companies.

As of 2018, Pymetrics has 50–100 employees and has raised more than $16 million in funding. It has partnerships with large corporations such as Unilever, Accenture, LinkedIn, and Tesla.

The Technology

Drawing on decades of behavioral-science research, Pymetrics has put together a series of games that evaluate test-takers on 90 key cognitive, social, and personality traits.

To better match candidates to companies, successful current employees of the client company first play the games to determine priority traits and characteristics. Pymetrics then builds a custom algorithm for that particular client that best predicts success. Finally, potential candidates are invited to play the games, and the matching process begins.
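To make the flow above concrete, here is a minimal sketch of how such a matching step *could* work. This is purely my illustration, not Pymetrics' actual method: the trait names, scores, and the use of cosine similarity against an averaged top-performer profile are all invented for the example.

```python
# Hypothetical matching sketch: average the trait profiles of successful
# employees, then score a candidate by similarity to that target profile.
from math import sqrt

def average_profile(employee_profiles):
    """Average the trait scores of the client's top performers."""
    n = len(employee_profiles)
    traits = employee_profiles[0].keys()
    return {t: sum(p[t] for p in employee_profiles) / n for t in traits}

def cosine_similarity(a, b):
    """Similarity between two trait vectors (dicts with the same keys)."""
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm

# Step 1: successful employees at the client company play the games.
employees = [
    {"risk_tolerance": 0.8, "planning": 0.6, "memory": 0.7},
    {"risk_tolerance": 0.7, "planning": 0.7, "memory": 0.6},
]
target = average_profile(employees)

# Step 2: a candidate plays the same games and is scored against the target.
candidate = {"risk_tolerance": 0.75, "planning": 0.65, "memory": 0.7}
score = cosine_similarity(candidate, target)
```

In practice a company like Pymetrics almost certainly trains a proper predictive model per client rather than a simple similarity score, but the shape of the pipeline — profile top performers, then score applicants against that profile — is the same.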

De-biasing

One of Pymetrics’ key features is de-biasing the matching algorithm. Gender inequality in the workplace is a pervasive and well-documented issue. Pymetrics recognizes that only evaluating the success of current employees in a biased workplace could give a skewed perspective on the determinants of success.

“The use of data-driven algorithms does not automatically guarantee fairness — the conclusions drawn from a data set can only be as inclusive as the input data itself. Fairness will only be achieved through active de-biasing of the data on which the tools rest.” ~ Pymetrics white paper on breaking the glass ceiling

To combat this, Pymetrics employs “active de-biasing methods” to detect and remove any biases in the selection algorithm. In doing so, Pymetrics can recommend future best performers that also bring diversity to the workplace.
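Pymetrics does not publish the details of its "active de-biasing methods," but one standard way to *detect* bias in a selection algorithm — and a test Pymetrics could plausibly audit against — is the EEOC "four-fifths rule": each group's selection rate should be at least 80% of the highest group's rate. The numbers below are invented for illustration.

```python
# Hypothetical bias check using the EEOC four-fifths (80%) rule.

def selection_rate(selected, applicants):
    """Fraction of applicants from a group that the algorithm selects."""
    return selected / applicants

def passes_four_fifths(rates):
    """rates: dict mapping group name -> selection rate.

    Passes if every group's rate is at least 80% of the top group's rate.
    """
    top = max(rates.values())
    return all(r >= 0.8 * top for r in rates.values())

rates = {
    "group_a": selection_rate(40, 100),  # 0.40
    "group_b": selection_rate(30, 100),  # 0.30
}
# 0.30 / 0.40 = 0.75 < 0.8, so this algorithm would be flagged as biased.
biased = not passes_four_fifths(rates)
```

A flagged algorithm would then be retrained or adjusted until the disparity disappears — which is exactly where the questions I raise below about *how* that adjustment is done become important.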

My Perspective

There are many areas of this startup I really like. For one, I am a strong believer that behavior is a much better assessment tool than self-reporting. Hell, I would even go as far as to argue that any assessment tool is better than self-reporting.

I also like the AI-driven aspect of matching candidates: the algorithms can learn from successful employees to match future applicants. This captures a variety of success drivers for a particular job and makes the algorithm unique to every company and position. There is really no “one size fits all” in hiring.

On the de-biasing step, I am very curious how these traditional biases are removed. On one hand, I definitely agree that learning from biased datasets gives a biased algorithm. However, I don’t think it is a good strategy to remove this discrepancy by adding in another ‘artificial’ bias. This goes back to the decades-old affirmative action debate.

Example questions from Pymetrics games.

I hope Pymetrics’ bias-fixing techniques simply involve redesigning the games or algorithms to minimize preconceived advantages, for example by making the games simpler or more unorthodox to negate greater male familiarity with video games.

However, if bias-fixing involves hard penalties just for being a specific gender or race, it would only exacerbate discrimination and inequality. Fortunately, I don’t believe this is the case with Pymetrics.

“Pymetrics serves as a blind audition for job candidates. Candidates move through the platform completely anonymously, and the prediction algorithm does not use any demographic information to assess career fit.”

If you liked what you read, please follow and subscribe to Startup Reviews on Medium and LinkedIn.


Leon is a PhD candidate at Princeton University researching cancer diagnostics and therapy.