AI Might Take Over the World Soon

Or it might not. We just don’t know

Hein de Haan
Oct 18

This article is a response to Ervin Nemesszeghy’s “Relax, AI Will Not Take over the World”. Ervin argues that if a takeover by Artificial Intelligence (AI) ever happens, it will not happen anytime soon.

Why Ervin thinks this remains a bit unclear to me. He does say that some people expect Artificial General Intelligence (AGI: an AI that’s roughly as intelligent as we are across all our intellectual domains) to emerge if we simply increase the computing power of AIs, and that he doubts this will be enough. While I agree that raw computing power is not the only thing needed for intelligence, it certainly helps: in humans, for example, brain size correlates with intelligence.

Ervin goes on to claim that even if a car can drive itself, that doesn’t mean it is intelligent in any way. Well, he must use a different definition of intelligence than I do. Navigating roads in traffic involves accurately perceiving the environment, controlling a body (the car), making real-life decisions (route planning, for example), and even making ethical decisions. If that is not intelligence, I don’t know what is.

Ervin’s outlook stems from what he sees as a lack of “understanding” in current AIs. However, he does not offer a definition of understanding. I look at understanding from a practical point of view: if somebody or something behaves as if it understands something, then it understands it. So if a car successfully drives itself, it understands the world of road traffic. This is basically how we test a person’s understanding, too!

Will AI take over the World anytime soon? It could. If we build an AI smart enough to understand its own source code, it could improve that source code, again and again, leading to an intelligence explosion that results in a superintelligence. Such a superintelligence could easily take over the World. If the idea of an intelligence explosion sounds unbelievable, consider this: nature slowly evolved more and more intelligent mammals without understanding intelligence at all, so an AI that actually does understand its own source code could probably upgrade its intelligence very fast. And if nature can evolve general intelligence (i.e., us), I’m confident we can create it too. It’s hard to predict when we will. Confidently saying it will happen within twenty years is probably too confident, but saying it won’t happen anytime soon is too confident as well. We just don’t know.
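To make the compounding argument a bit more concrete, here is a minimal toy sketch in Python. It is my own illustration with made-up numbers, not a model of any real AI system: the assumption is simply that every time the AI rewrites its own source code, its capability grows in proportion to how capable it already is.

# Toy model of recursive self-improvement (illustrative assumptions only)
def intelligence_explosion(capability=1.0, improvement_rate=0.1, generations=50):
    """Each generation, the AI rewrites itself and gains capability
    proportional to how capable it already is (compound growth)."""
    history = [capability]
    for _ in range(generations):
        capability += improvement_rate * capability  # a better AI makes better improvements
        history.append(capability)
    return history

history = intelligence_explosion()
print(f"After {len(history) - 1} rewrites: {history[-1]:.0f}x the starting capability")
# With these invented parameters, capability grows roughly 117x in 50 rewrites.

The specific numbers mean nothing; the point is only that self-improvement which compounds on itself gets very fast, very quickly.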

Thanks for reading!