Why Advanced AI Is an Existential Threat

Future AI might be smarter than us. Would it care about humans?

Hein de Haan
How to Build an ASI



Even though gorillas, elephants and many other animals are much stronger than humans, their survival as species depends in large part on us. The reason, of course, is our intelligence: we are smarter than all other animals, which has allowed us to build a scientifically and technologically advanced civilization — destroying natural habitats and endangering species along the way.

In a twist of irony, it's plausible we'll also give rise to a new "species" in the future: Artificial Superintelligence (ASI). This simply means AI that is smarter than humans in all cognitive domains: doing math, designing rockets, holding conversations, you name it. Such an ASI would be smarter, and possibly far smarter, than even John von Neumann, Albert Einstein or your favorite genius. AI has become more and more advanced over the years, organizations are pouring billions into AI research, and ASI would be hugely valuable, so it's plausible some organization will create one at some point.

If we define intelligence as the ability to reach goals in a wide range of environments, an ASI would by definition be very good at reaching its goals. If those goals don't include the survival of the human species, we may very well…


As a science communicator, I approach scientific topics using paradoxes. My journey was made possible by a generous grant from MIRI (intelligence.org).