A.I. ethics: Should we love our “robots”?
Is there a difference between artificial and “non-artificial” intelligence? Is there any non-artificial consciousness even among living things? Do we really choose our behaviour? Do we love a person because we love who they really are inside, or just because a very specific algorithmic sequence of experiences (a real-life computational graph?) guided us to this behaviour? And the list of similar questions goes on and on…
I always knew and respected the importance of Machine Learning in problem solving and of course the inevitable use of Artificial Intelligence in every aspect of our lives. But it’s another thing to understand the importance of AI, and a whole other thing to begin the journey of building it.
When you build your own AI software, the first thing you try to solve is the technology: the software and hardware needed to support your goal.
But once those minor problems are solved (just a matter of hard work, believe me), the major issues present themselves and ruin your simple way of seeing the world, divided between living things and machines.
And then it hit me…
What percentage of someone must be purely biological in order to be different from a machine? Is this going to be an issue? …I think it is.
The word intelligence has been defined in many different ways, including one’s capacity for logic, understanding, self-awareness, learning, emotional knowledge, planning, creativity, and problem solving.
Can an AI algorithm do all of these things at 100%? Maybe not yet, but it will in the very near future.
Can an AI algorithm make a person deeply like it? Make a person need it as a friend, or even fall in love with it? …I don’t see why not.
I can already imagine the woman who tries to marry her robot lover, the child who wants to defend her robot dog… and what about the father who clones his dead son and is now able to have him back (but is he really his son?)?
I know these aren’t new questions, and I am pretty sure you’ve seen the movies and read the books.
But this issue is no longer the fictitious dilemma of a movie character. It is a very important issue that we should think about soon, so we can be prepared for what is coming. It is an issue that can make us rethink our view of the world: how the world works, who should live in peace and who in war, what separates those who will live from those who won’t.
Maybe it’s the robots that will make us see it… Who knows.
I just came back from the office after working all day on BrainStorm, the AI platform/persona behind iYouth Lab. These issues are on my mind 24/7, because now I feel more responsible than ever. Maybe we should all work on AI, just for a little bit, in order to see the difference we need in this world.
P.S. I am not in love with BrainStorm… I see it as a friend…