How AI Can Make Us Shallow and Biased

Vlad Rasskazov · Published in BreakThrough · Jan 6, 2018 · 3 min read

Artificial intelligence (AI) is meant to reduce cognitive load, making it easier for us to make decisions and learn new things.

AI can significantly improve our productivity by reducing the total mental effort required to complete tasks. At the same time, relying too heavily on AI in the decision-making process carries hidden risks.

Forgotten Rules and Omitted Information

Remember how you set up your email inbox to filter incoming messages:
- if… archive to the appropriate folder,
- if… delete,
- if… mark as spam.

The filtering system works perfectly well as long as you remember the filtering conditions (the IFs) you set. After a while, you stop checking the folders regularly, and important information may slip by unnoticed.

  • The quality of the underlying information-processing rules dictates the quality of decision-making.
  • Important signals may be filtered out by an imperfect information-processing system.
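
To make this concrete, here is a minimal sketch of what such rule-based filtering might look like. The rules, folder names, and addresses are hypothetical, not taken from any real mail client:

```python
# Minimal sketch of rule-based inbox filtering (hypothetical rules and folders).
# Each rule is a predicate plus an action; the first matching rule wins.

from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    subject: str

RULES = [
    # (condition, action) pairs -- easy to write, easy to forget later
    (lambda m: "newsletter" in m.sender, "archive:Newsletters"),
    (lambda m: "unsubscribe" in m.subject.lower(), "delete"),
    (lambda m: m.sender.endswith("@unknown.example"), "spam"),
]

def route(message: Message) -> str:
    """Return the action of the first matching rule, else keep the message in the inbox."""
    for condition, action in RULES:
        if condition(message):
            return action
    return "inbox"

# An important invoice from a newsletter-like address is silently archived:
msg = Message(sender="billing-newsletter@bank.example", subject="Overdue invoice")
print(route(msg))  # -> "archive:Newsletters" -- never seen unless you check that folder
```

The rule works exactly as written; the problem is that nothing reminds you it exists.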

Fed up with Biased Information

Today we are all used to receiving and acting on high-level information, as the systems around us are designed to deliver only the most significant pieces. Just think about your Facebook feed, Google push notifications, or Outlook email client.
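
As a purely illustrative sketch (not how Facebook, Google, or Outlook actually rank content), imagine a feed that delivers only the top-scoring items; everything below the cutoff never reaches you:

```python
# Hypothetical feed ranking: only the top-scoring items are shown,
# and everything below the cutoff silently disappears from view.

posts = [
    {"title": "Friend's vacation photos", "engagement_score": 0.92},
    {"title": "Local election coverage",  "engagement_score": 0.35},
    {"title": "Viral meme",               "engagement_score": 0.88},
    {"title": "Public health advisory",   "engagement_score": 0.30},
]

def build_feed(posts, top_k=2):
    """Return only the top_k posts by score; the rest are never delivered."""
    ranked = sorted(posts, key=lambda p: p["engagement_score"], reverse=True)
    return ranked[:top_k]

for post in build_feed(posts):
    print(post["title"])
# Prints the vacation photos and the meme; the election coverage and the
# health advisory are filtered out before the user ever sees them.
```

Whatever the scoring function rewards is what you see; whatever it penalizes quietly vanishes.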

Remember: before taking in information published online, always do your best to assess its quality.

- This may be the search result I am looking for, but what if there is something else?
- Can I trust this information? Isn’t it biased?
- Should I rely on the information delivered in this story/video/podcast?

You wouldn’t eat rotten fruit, so why should you take in bad information?


Unconscious Trust in Information Services as a New Norm

The adoption of technology is accompanied by growing trust in the outputs of information systems. Trust in information services and systems is becoming a new norm; but should we actually follow this norm?

Remember the “fake news” crisis on Google and Facebook? It shocked a lot of people and reminded us that even the most trusted systems may be flawed.

The ease of use and convenience of apps switch off the mental filters we use to process incoming information streams. It follows that reducing cognitive load not only facilitates learning and decision-making but also degrades our own ability to process information.

Conclusion

  1. The convenience of information services builds trust and turns off mental filters.
  2. High-level information may be unintentionally (or intentionally) biased because of complicated system design.
  3. Decisions based on trust and biased information are bad decisions.
