Illustration by Dominik Heilig

On Algorithmic Transparency: Why is it important and how can we shed light on AI black boxes?

Bias Factors

Hiring is a difficult and very important job for almost every company. Finding new colleagues who share the values of the organizational culture is a big challenge, and it requires a lot of time, knowledge and resources. To ease part of this burden, an algorithm within recruitment software can be very useful. While designing such algorithm(s), however, designers have to make difficult decisions that directly affect many people's lives, for example how, or whether, to take the following factors into account (a sketch of how such choices can be made explicit follows the list):

  • The ethnicity of the candidates
  • Gender pay gap
  • Age and experience
  • Skills matching
  • Values matching
  • Personal, physical attributes
  • Educational background

Control over parameters and input-output relations

Just as candidates should have the right to know the reasoning behind decisions on their applications, recruitment managers should have the right to know the parameters of the algorithm and how changes in the input affect the output. With that knowledge they can steer the algorithm according to the circumstances as they see fit, rather than leaving full control to the algorithm itself.
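
As a sketch of what that could look like in practice, the snippet below (which assumes the Candidate class and score() function from the sketch above) reports how much the final score moves when a single input changes while everything else is held fixed.

```python
# Simple input-output sensitivity report for the toy score() function
# above: perturb one feature at a time and show the change in the score.
from dataclasses import replace

def sensitivity(candidate: Candidate, feature: str, delta: float) -> float:
    """Change in score when `feature` is increased by `delta`, all else fixed."""
    perturbed = replace(candidate, **{feature: getattr(candidate, feature) + delta})
    return score(perturbed) - score(candidate)

if __name__ == "__main__":
    c = Candidate(skills_match=0.6, values_match=0.7,
                  years_experience=3, education_level=1)
    for feature, delta in [("skills_match", 0.1), ("values_match", 0.1),
                           ("years_experience", 1.0), ("education_level", 1)]:
        change = sensitivity(c, feature, delta)
        print(f"{feature} +{delta}: score changes by {change:+.3f}")
```

A report like this gives a recruitment manager a direct view of the input-output relations without requiring them to read the code.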

Summary: Counterfactual Explanations

Opening an AI black box can sometimes be very difficult. What can be done in such cases? Articles by Sandra Wachter, Brent Mittelstadt and Chris Russell show that counterfactual explanations, which tell users what changes to their input would have led to a desired outcome, are useful here: instead of exposing the model's internals, the system explains a decision by pointing to a small change that would have changed it.
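
As an illustration of the idea (not of the authors' actual optimization-based method), the sketch below reuses the toy Candidate and score() definitions from the first sketch and greedily searches for the smallest increase in a single feature that would flip a rejection into a shortlisting. The decision threshold, step size and feature bounds are assumptions made up for the example.

```python
# Toy counterfactual explanation: find the smallest increase in one
# feature that pushes the score over an assumed shortlisting threshold.
from dataclasses import replace

THRESHOLD = 0.7  # assumed cutoff: score >= THRESHOLD means "shortlisted"

def counterfactual(candidate: Candidate, feature: str,
                   upper: float, step: float = 0.05):
    """Smallest value of `feature` (up to `upper`) that crosses the threshold."""
    value = getattr(candidate, feature)
    while value <= upper:
        changed = replace(candidate, **{feature: value})
        if score(changed) >= THRESHOLD:
            return value
        value += step
    return None  # no counterfactual within the allowed range

if __name__ == "__main__":
    c = Candidate(skills_match=0.5, values_match=0.6,
                  years_experience=2, education_level=1)
    needed = counterfactual(c, "skills_match", upper=1.0)
    if needed is not None:
        print(f"Had skills_match been {needed:.2f} instead of "
              f"{c.skills_match:.2f}, the application would have been shortlisted.")
```

The explanation the candidate receives (for example, "your application would have been shortlisted if your skills match had been roughly 0.95") says nothing about the model's internals, which is exactly why counterfactual explanations are attractive when the black box cannot be opened.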

Open Questions

1. How can algorithmic transparency become part of the design process?

Lucidminds AI

With Complex System Design & Analytics, we translate Discourse to Practice