Paradox of AI 3: Artificial god.
I heard a very interesting story in an interview John Oliver did with Stephen Hawking. Hawking didn’t say where he got the story, perhaps because of the effort of speaking through an eye-tracking communication device. In the story, scientists created an artificial intelligence and asked it whether there was a god. The computer produced a lightning strike that welded its plug to the wall, so it couldn’t be unplugged, and said: “You have a god now.” The story is quite terrifying. But I think it is more terrifying that the ascension of the machines will not necessarily take place in such a violent way. Imagine that in the 90s I had told you that a dictator would force you to hand over your personal data or be marginalized. That process has already happened, except that we gave our information away voluntarily. There was no need for guns; it was enough to build a slope for us to slide down into exactly the place where the heads of the social networks wanted us. Facebook never threatens you with anything. It only places a button you will find convenient. Google offered storage space that we happily used, but once everyone was using the service, it became hard to leave it.
As for what the meaning of human presence and activity will be in the age of AI, I think it will be much the same as it is for children. We will be able to learn and to play, but we will not take part in important decisions. For children, this is because they are not mature enough. In the future, it will be because it makes no sense to have humans decide when a superior decision is already available or taken for granted (“If you write anything on a computer, you need to get Grammarly”). As for learning and playing, we will work only for the enjoyment of working, in line with the theory of flow. Perhaps machines will make up imaginary worlds for us, in the fashion of Kidzania, or perhaps in the fashion of games. We don’t need to build music-making robots, because we simply enjoy making music and wouldn’t want to stop. This will put us in a state of constant learning, and perhaps we will finally start developing in the other directions we neglected while building our science and technology.