Should we bias toward equality?

Jack Davis
Published in Aree
2 min read · May 25, 2017

Dear internet,

I have a question for you. But first, some quick background.

My team and I have created a product, Aree, which aims to make people happier at work and, indirectly, to build more efficient teams for employers.

Aree discovers, rates, and curates jobs for our users, and does the same for companies.

To achieve this, we use AI to rate a person against a potential job. It weighs factors like culture, experience, commute time, and salary.

Now it’s time for Aree to address equality. Herein lies my question. For the sake of the conversation I’m going to use gender (male/female only, for simplicity).

Imagine we have an employer with 6 males and 3 females. I’d argue that team should hire more female staff to balance its gender ratio.

Let’s say this hypothetical team uses Aree, and Aree‘s job is to recommend another team member. Aree knows the team’s gender ratio is unbalanced. Should Aree bias toward candidates who are female? That is, given two otherwise identical candidates of opposite genders, should we suggest the female candidate?
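For concreteness, the tiebreak in question could be sketched like this. This is a hypothetical toy model, not Aree’s actual system: the `Candidate` fields, the single `fit_score`, and the `recommend` logic are all illustrative assumptions.

```python
# Toy sketch of an equality-aware tiebreak. All names, fields, and
# logic here are hypothetical illustrations, not Aree's real model.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    gender: str       # "male" or "female", simplified as in the post
    fit_score: float  # 0..1 match from factors like culture, experience, commute, salary

def recommend(candidates, team_genders, equality_bias=True):
    """Pick the best-fit candidate; on an exact tie, optionally prefer
    the gender that is underrepresented on the current team."""
    best = max(candidates, key=lambda c: c.fit_score)
    tied = [c for c in candidates if c.fit_score == best.fit_score]
    if equality_bias and len(tied) > 1:
        # Among the tied candidates' genders, find the one least
        # represented on the team today.
        genders = {c.gender for c in tied}
        minority = min(genders, key=lambda g: team_genders.count(g))
        for c in tied:
            if c.gender == minority:
                return c
    return tied[0]

# Team of 6 males and 3 females, as in the example above.
team = ["male"] * 6 + ["female"] * 3
a = Candidate("A", "male", 0.9)
b = Candidate("B", "female", 0.9)

print(recommend([a, b], team).name)                        # → B
print(recommend([a, b], team, equality_bias=False).name)   # → A
```

The entire ethical question lives in that one `if equality_bias` branch: with it, identical candidates are separated by the team’s current imbalance; without it, the system is gender-blind and simply returns the first top-scoring candidate.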

Factors at play:

  • Will the team be happier (and more productive) with a more balanced gender ratio? (See the government’s opinion)
  • What’s the gender ratio of the industry?
  • How might gender impact success/happiness for this particular role, if at all?
  • What if the company needs to maintain a “quota” to improve gender balance?

Our simplified example extends to a wider question that applies beyond gender to race, socioeconomic status, appearance, sexual preference and more.

Given we can actively drive equality across the workplace, should we?

Should we bias toward equality?

Or, in this example, should we make our AI blind to gender, which might inadvertently widen the existing gender-ratio disparity?

I do not know the right answer, or if one even exists. I only have more questions:

  • What if this is just political correctness gone mad?
    Why should Aree treat potentially discriminatory factors differently from “normal” factors?
  • Might we inadvertently alienate individuals, in a world where industries stereotypically hire particular personas (e.g. “builders are men”, “nurses are women”), for better or worse?
  • Our goal is to guide individuals toward jobs which suit them, and therefore make them happier and more fulfilled at work. How does equality bias balance with employee efficiency?
  • Our goal is also to provide opportunities for change: to recommend career paths which otherwise might not have been considered. How might equality bias provoke change, positive or negative?
  • How might equality bias impact work efficiency? How might business be affected if one hires against statistical success?
  • How can we balance the incitement of social change with improvement to business productivity?
  • Could Aree break the law by considering these factors?

What are your thoughts? Drop a comment here, or message us directly on Aree.
