Dear Amazon, your algorithm MUST be biased

  • Kill the AI system to avoid bias. This seems easy, but the business will not always accept such a choice because of its impact on growth and on disruptive innovation trends.
  • Aggressively preprocess the data. Since the data is biased by nature, we should study it and remove any feature that contributes to the bias. For example, in Amazon’s data we might remove anything that identifies the applicant’s gender (see the first sketch after this list). I believe data cleansing and preprocessing is not as simple as that, though; we need to spend a good portion of the time just preparing the data.
  • Tune model parameters. We may give more weight to certain attributes, such as diversity, in the learning model (see the weighting sketch after this list). However, which is more important, qualifications or diversity? In the end, tuning parameters is not a very hard task, and it could even generate a publishable paper :)
  • Consider instance-based learning. The data distribution and its underlying characteristics change rapidly, a.k.a. data shift, so we need to take this rapid change into account. One way is not to build a model at all, but to consider all the data points we have at prediction or classification time (see the lazy-learning sketch after this list). This mitigates the bias each time we receive a new, presumably unbiased sample. However, we might end up biased toward the other side, against males in Amazon’s case.
  • Have unlimited time, data, and computational resources. While time and computational resources can be addressed with AWS HPC capabilities, unlimited multi-source data is very unlikely to be obtained and learned from. Most machine learning algorithms learn from well-organized, homogeneous datasets; it is not as easy to learn from unstructured and/or heterogeneous data.
  • Change the error metric. Current error metrics evaluate learning performance by comparing prior baseline labels against the predicted ones. If we think the learned bias is undesirable, we need to penalize it: introduce bias toward specific attribute(s) as an undesirable outcome in the metric itself (see the metric sketch after this list). But the question remains: who should decide that? A human, who is biased by nature, or another machine?
  • Just live with the bias. Amazon is dominated by male employees. Why kill a system that is trying to mimic their behaviour? Why not kill their own biased behaviour instead? I believe Amazon’s HR will always be biased toward something, simply because they have selection criteria.
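
For the preprocessing option, here is a minimal sketch of what “remove anything that identifies the gender” could look like in practice. It assumes a pandas DataFrame with a hypothetical gender column; the column names and the 0.6 correlation threshold are illustrative assumptions, not Amazon’s actual pipeline.

```python
# A minimal sketch of bias-aware preprocessing, assuming a pandas DataFrame
# with a hypothetical "gender" column. Column names and the 0.6 threshold
# are assumptions for illustration only.
import pandas as pd

def strip_gender_signals(df: pd.DataFrame) -> pd.DataFrame:
    """Drop explicit gender columns and flag likely proxy features."""
    # Columns that directly identify gender (these names are assumptions).
    explicit = [c for c in df.columns if c.lower() in {"gender", "sex", "salutation"}]

    # Flag proxies: numeric features strongly correlated with the gender label
    # probably leak the same information and deserve manual review.
    if "gender" in df.columns:
        gender_code = df["gender"].astype("category").cat.codes
        numeric = df.drop(columns=explicit).select_dtypes("number")
        proxies = [c for c, r in numeric.corrwith(gender_code).abs().items() if r > 0.6]
        print("Possible proxy features to review:", proxies)

    return df.drop(columns=explicit)
```

The second step is the part that makes this harder than it sounds: dropping the explicit column is trivial, while deciding what counts as a proxy is exactly the time-consuming preparation work mentioned above.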
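For the parameter-tuning option, one deliberately simple interpretation of “give more weight to diversity” is re-weighting training samples so that a diversity knob exists at all. The sketch below assumes scikit-learn and a hypothetical `is_underrepresented` indicator; the `weight` value is precisely the qualifications-versus-diversity judgement the bullet questions.

```python
# A minimal re-weighting sketch, assuming scikit-learn, a feature matrix X,
# binary hire labels y, and a hypothetical `is_underrepresented` indicator.
import numpy as np
from sklearn.linear_model import LogisticRegression

def fit_with_diversity_weight(X, y, is_underrepresented, weight=2.0):
    """Upweight under-represented applicants; `weight` is the tunable knob."""
    sample_weight = np.where(is_underrepresented, weight, 1.0)
    model = LogisticRegression(max_iter=1000)
    model.fit(X, y, sample_weight=sample_weight)
    return model
```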
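For the instance-based option, the sketch below illustrates lazy learning: nothing is trained up front, every stored sample votes at prediction time via k-NN, so newly added samples immediately shift decisions. All names are placeholders, not Amazon’s data.

```python
# A minimal lazy-learning sketch: no global model is trained; the full
# instance pool is consulted at prediction time, so fresh samples take
# effect immediately. Names are placeholders.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

class LazyScreener:
    def __init__(self, k: int = 5):
        self.k = k
        self.X_pool = None
        self.y_pool = None

    def add(self, X_new, y_new):
        """Append fresh samples to the instance pool."""
        if self.X_pool is None:
            self.X_pool, self.y_pool = np.asarray(X_new), np.asarray(y_new)
        else:
            self.X_pool = np.vstack([self.X_pool, X_new])
            self.y_pool = np.concatenate([self.y_pool, y_new])

    def predict(self, X_query):
        """Defer all learning to query time: fit k-NN on the current pool."""
        knn = KNeighborsClassifier(n_neighbors=self.k)
        knn.fit(self.X_pool, self.y_pool)
        return knn.predict(X_query)
```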
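For the error-metric option, one way to “introduce the bias as undesirable” is to subtract a penalty for the gap in positive-prediction rates between groups from an ordinary accuracy score. The `group` array and the penalty weight `lam` are assumptions for illustration; choosing `lam` is exactly the “who decides” question raised above.

```python
# A minimal bias-penalizing metric: accuracy minus a penalty for the gap in
# positive-prediction rates between two groups (a hypothetical binary
# `group` array, e.g. gender). `lam` encodes how much the bias is penalized.
import numpy as np
from sklearn.metrics import accuracy_score

def fairness_penalized_score(y_true, y_pred, group, lam=1.0):
    y_pred, group = np.asarray(y_pred), np.asarray(group)
    rate_a = y_pred[group == 0].mean()   # positive rate for group 0
    rate_b = y_pred[group == 1].mean()   # positive rate for group 1
    parity_gap = abs(rate_a - rate_b)    # demographic-parity gap
    return accuracy_score(y_true, y_pred) - lam * parity_gap
```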

Salem Alelyani
Center for Artificial Intelligence, Director