And given the advantages over us that even human intelligence-equivalent AGI would have, it’s pretty obvious that it would only hit human intelligence for a brief instant before racing onwards to the realm of superior-to-human intelligence.”⁵³
— Pawel Sysiak, AI Revolution 101

And this is the existential crisis we face. As we move blindly toward this moment, we have nothing approaching a shared set of values as a species. Technological development should take a back seat in human progress while we spend our collective cognitive resources attempting to reconcile an entire planet’s worth of legacy worldviews. It’s possible, but all our blind spots inadvertently collude to keep us apart, and it’s downright terrifying when you step back and see the whole picture.

Story by Cory Caplan.