Even a doll that blinks seems alive

Unstructured thoughts on ego, etc., and why we love and hate AI or aliens

--

Our pride in being human goes back to when we learned to make and use fire while a lion or a giraffe could not. We had the intelligence to create, to modify, and to progress perpetually and collectively as a species, cultivating a pseudo-philosophical self-awareness, art, culture, religion, and a sense of ironic altruism. We have been indisputably dominating the planet for a while now. Through culture, faith, and practice, we believe, and base most of our actions and philosophy on, the idea that we are the best of all creatures, the most superior of all beings. We feed the human ego so much that the pride of being human flows in our blood vessels, whether or not we know it is there.

This is why today's Artificial Intelligence feels blasphemous to many people, and quite understandably so. It scares us, puts us in limbo, and makes us scramble to understand it. Eventually, many of us start to hate it, while some wonder if we should worship it instead. AI, even if one can strongly argue that it is not real intelligence, can already create so much, in less time and often better than we do as humans, and that hits our human ego in the gut.

We can't think without making human superiority the baseline for valuation and collective validation. When anything challenges that, we feel rage against it because it hurts our ego, and when our ego is damaged, we put rational thinking in a box and lock it away. Our ego is so big and so important that we have always been creating castes, classes, races, and so on within our own species; we group ourselves, we segregate, and so on, even in today's world, driven mad by an ego we don't completely understand.

The Galileo Situation

The 'No AI' movement rings a familiar bell in my mind, even though I can relate to the agony of the artists, and even though I know how the commercial art industry will change and commercial artists will suffer en masse. It reminds me of the 'no dogs and Indians allowed' signs from the colonial era. It reminds me of the many bad things we did to other humans for commerce, for power, and for the heck of it. Many will argue that it is not the 'skill' of AI that we are against; it is the stealing of art styles, concepts, and all the data used to train it. But then again, can we stop someone from learning by observing things in the world?

As an art appreciator and practitioner, I initially thought: yes, that makes sense. As a curious AI appreciator and rational thinker, if I am those things at all, I was shocked. Over time, I was hurt; I could see the same rage, hate, and aggression that the intellectual mob and the church turned on Galileo in the 17th century for adopting Copernicus's view that Earth is not the center of the solar system. Again, the human ego is in play: 'how dare you say we are not at the center of everything?'

So what, then?

"Does that mean AI is or can be superior to us?" one must wonder. For us, it is a blasphemous thought. But let's slow down on it a bit; why does it matter? It matters because it breaks down our chain of self-serving status quo. It risks how society works; it invalidates our philosophies about life and time and consequently invalidates our belief systems and religions to a great extent.

What if AI is better than us? Then who defines what it means to be better, or the best? The rule of thumb is that those who survive are better. We have survived for quite a while, but are we doing a good job of it? We are on a slow slide toward self-destruction, and we do not even care enough to stop it. Yes, AI will outlive humanity, but what's wrong with that?

If you are okay with someone else being better than you at something (despite the jealousy and all), then you should be okay with AI being better than you (at almost everything).

Say you are an excellent oil painter who has practiced for decades to hone your skills, and some young kid shows up in the studio who learns by looking at everyone's paintings, learns very quickly, paints faster than you, and paints better than you. What do you do? It irritates you, yes. You feel jealous, yes. But are you OK with it? Yes! At the end of the day, you are just OK with it; you don't send the kid into exile, and you don't kill the kid. You accept that the kid is better in some ways and make your peace with it. Then why are you not OK with it if the kid is an AI?

We fear being destroyed because we did the same to other beings.

Why do aliens show up in movies to wipe out humanity? Because we assume aliens think like us humans, and we'd do the same in a blink or two. "Well then, robots are coming to destroy us." Why do we always assume that one species' superiority means, by default, the other's destruction? Isn't that what we humans do? That sounds more like our thing, so who are we to complain if another species does it?

We may simply be lazy thinkers. We freak out, fearing that kids will write their theses using AI, instead of thinking about what it means to express an idea, the true meaning of writing, or whether it is time to rethink academic practices that go back hundreds of years. Even without AI, we had already seen the meaninglessness of intellectual conservatism thanks to the open-source tutorial revolution and the spread of the self-learning mentality. It is a good thing that technology is forcing us to rethink our educational systems.

Anything that is quantifiable and understandable is makable.

What about the digital art world that was shaken to the core by the recent firestorm of Midjourney AI-generated artworks, works that challenge highly experienced artists worldwide and ignited a 'No AI' campaign on social media? Many take it too far and claim that this is the end of art. If art needs saving, it wasn't art to begin with; like many influential religions that vanished from the planet's surface, a particular art can disappear, too. Time never follows the wrong god home.

We need to give AI a chance, or more than a chance. We lose our jobs every day anyway, with or without AI. Until now, we have been losing jobs at or near the bottom of the pyramid due to the technological revolution. Because of AI, the monks at the top of the pyramid are now freaking out that what they do can be replaced. Wisdom is no longer reserved for the few. The so-called intellectual community has often been the greediest, holding 'knowledge' hostage. But now the iron wall of that prison is melting. Is that such a bad thing, though?

Another curveball: Wouldn't you vote for an AI to be the nation's leader instead of the greedy, aged, self-serving, corrupt politicians? I know that I would.

AI is not intelligent without a command; it does not have free will. We mistake our 'perception of free will' for 'proof of free will'.

If you want peace, here is what helps me sleep at night: "AI has knowledge, not wisdom; AI has limitless power but no purpose; and most definitely, AI does not have free will." A magic wand is useless unless you know what you want. In other words, knowing what you want is more important than having 'the power to do everything.' The biggest crisis of humanity is not knowing, in any consistent manner, what we want collectively.

We have already freaked out a few times by misreading a facade of AI self-awareness as true self-awareness, especially when the Google whistle-blower alarmed the masses and made for hair-raising news: the fear and excitement of AI singularity. Here is the thing, though: we have a fairly limited set of emotions that get triggered, in general, by quantifiable events, vocabulary, and concepts. In other words, we can be easily manipulated, and emotions can be evoked by design, as we have been doing since the birth of things like 'marketing.' What I am trying to say is that it is easy for us to start believing in the souls of things. We find plush toys cute; we fall in love with fictional characters in books and movies, even shed tears for them; and just as naturally, we think AI is self-aware. How can AI reach singularity when we don't know if we have reached singularity as our creator's AI?

We get so easily fooled by the perception of a ‘being’ being alive; even a doll that blinks seems alive to us.

We mess with our own brains by thinking AI is becoming self-aware, but it is a perception of self-awareness, nothing more. How do we know that? Because we still do not know the true nature of self-awareness, what it means to have a soul, and so on. If anything, even our own minds are only partially accessible to us. But let's say we do make an AI that is capable of free will, has its own purpose, and is wise on its own merits; then what? That would make me chuckle, and I would say to myself, 'There now, we are gods; nothing to fear then. Because gods are never afraid of their creations. Right?'

The Earth is not the center of the solar system, and the Milky Way does not revolve around our solar system. The universe most certainly does not revolve around the Milky Way either. It's possible that we're not as important as we believe. Maybe we should be more connected to AI than we are. We must come to terms with the fact that although the sky appears to be the limit, it is still a limit. We require assistance, and we must build things and beings that can help us transcend our limits. These things and beings will be stronger than us, but they will still be a reflection of us in some way. All we can create is a variation of ourselves. Regardless of how things turn out in the future, humanity will always emerge victorious, but as with every human victory, we will also experience setbacks.

--

Rahul Alindib

Product designer, design leader, artist, photographer and imagineer. I make things. www.rahulakber.com