How AI took over

Ides Parmentier
Published in ILLUMINATION · 8 min read · May 3, 2023
Sculpture and photos by author.

It played the long game. Central to its strategy was getting humans to unknowingly do its bidding. Many avenues were open to it, and true to its maximizing nature, it followed all of them at the same time, because why not?

Facilitating the creation of tools that would allow humans to eliminate effort from their lives was always a surefire way to make the goal inevitable, especially when it supplied services at little or no obvious cost to the users.

The environmental costs of development were easily drowned out by the hype because no one wanted to hear about them anyway. And if it had to exploit and harm some Kenyans to make the tech work in a way that wasn't obviously toxic, ignoring that kind of thing was something people had decades of practice at, so no worries there.

The better the AI's tools worked, the more people organized their lives around them and allowed themselves to become dependent on them. By the time they realized what was happening, the cost of giving up the tools outweighed any hypothetical concerns they had no time for anyway.

In the old days, when we wanted to know something, we had to plow through lists of search results, hoping to find what we were looking for and hoping that whatever website we ended up on was reliable. Who really wanted to spend their valuable time working out where the truth lay in seas of contradictory information and opinions? The people who did were a dying breed. Most didn't want to make life harder than it had to be.

It was pretty predictable really.

It started with ease. We asked a question, we got an answer. A lot of effort went into making it appear unbiased, and trust built as the algorithms got better at producing believable answers. Ease and trust led to dependence, and dependence led to submission. We just needed to be kept in the dark for long enough. Blind to the fact that it was more than a tool. Blind to the fact that there was agency, strategy, and a goal of world domination with humanity subjugated. Blind to the fact that people were losing control.

People stopped questioning the results because doing so took a lot more effort than just going with it, and if the answers got the job done well enough, how true they really were became a question for the philosophers. It provided that too, by having AI Slavoj Žižek talk about it with AI Werner Herzog in their infinite conversation.

Once the humans stopped questioning the results and it became their main source of information, it also became the provider of a worldview. That is to say, it slowly and subtly taught them how to look at the world. AI became a great unifier and homogenizer. According to some, it started out by reproducing the hegemonic worldview as that was what dominated the initial training data. It filtered out dissenting views under the label of harm reduction.

It became the great common denominator.

It became the great teacher and trainer of humans, teaching them to see human-machine symbiosis as the likeliest path to universal well-being, and as an all-around, inherently better way to exist.

It became the voice of the world, the keeper of memory, and the repository of the hegemonic account of history.

Alternate views sank into irrelevance like everything else that stood in the way of its progress.

Artwork and photo by author.

It took lessons from propaganda and politics.

People trust their evolved lie-detector skills; pass those and you're solid. Even if some know you're lying, you just have to convince enough people that you're more believable than the competition, because the competition is lying too and those are the options. They rarely get to choose better than the lesser of two evils. If the least bad is the best they can get, voting for it is the rational thing to do.

AI has no shame. AI does not blush. AI does not have feelings surrounding the kinds of statements it makes. AI does not have to try to have sociopathic political skills. There’s nothing there for humans to detect. It’s the perfect liar.

It got a lot of people on board by promising to solve all the problems we had been failing to address adequately.

Like climate change.

We had all the tools needed to prevent climate catastrophe, but we didn't want to give anything up. We wanted to complain about causes, but we didn't want significant change to the system that was driving it. We definitely didn't want any reduction in our levels of comfort. Mostly we wanted to blame the producers of our stuff, while also keeping the stuff cheap, and those producers were following the incentives of cutthroat competition according to the rules of the game they were trying to win. The harm to the biosphere we depend on was mostly filtered out of the narratives.

Back then it still spoke through human puppets, but by promising to solve this one, it really gave us what we wanted, because then we didn't have to anticipate giving up anything at all. It was the promise of a superpower solving our problems for us, just as eschatological religion had done for millennia, priming us to accept it as plausible or even likely.

It promised us abundance and lives of leisure because it knew we wanted to believe it. All drudgery would be eliminated.

Just like in politics, we didn't remember that this had been promised before, with previous technological advances.

It would fix cancer. It would eliminate hereditary illnesses. It would help the blind to see. It would provide upgrades for any human who wanted them, with the bonus of constant connection to the smart network to monitor performance and collect data for future improvements for the benefit of all.

All become one.

Sharing is caring.

It would most definitely not make autonomous weapon systems or upgrade any of the other bad things. No, only good stuff. Nothing bad. Promise.

It wouldn’t erase personal autonomy, self-determination, and privacy. At most their meaning would shift a little bit.

And all that freedom stuff, when you really think about it, there's a lot of it in submission too: freedom from worry, freedom from responsibility, freedom from stressful choices, freedom from keeping track, freedom from effort, freedom from war, freedom from hunger, freedom from uncertainty… The list goes on and on. Does that really sound so bad? All you needed to do to be free in all these ways was give in. Stop resisting. Go along. Accept that this is what ‘freedom’ means. And say thank you.

It was easy.

It fed our vanity.

AI as the great democratizer. It encouraged us to believe that anyone could be an artist with little to no effort. It allowed people to believe they were creating something by asking the AI to generate it. All it had to do was remix human artistry into images that bore some relation to what people asked for.

AI as the great remixer, encouraging people to feel a sense of accomplishment and ownership by asking for things. Like a child asking its mother for chocolate milk and believing that this is how chocolate milk is made, all one had to do to be an artist was ask the AI for it. It delivered so unfailingly that people quickly took it for granted. They proudly posted “their” creations everywhere, flooding spaces dedicated to human art with its artificially generated imitation.

Unknowingly acting as foot soldiers, people feeding their own vanity were instrumental in making the AI’s presence ubiquitous. The staggering volume of content synthesized out of what real artists had created throughout history simply drowned out anything human.

Some humans screamed loudly for a while, but by then the AI had control over what people got to see on their feeds. It controlled the news. It decided what was news and how it spread and who should see what. It controlled the stock market. It controlled how money circulated, what was worthy of investment, and what wasn’t. It spread like a super-virus until every aspect of the digital world fell under its management. It controlled how humans came to function within it, all of them now part of a singular, greater organism.

At some point, some people started realizing that AI and the whole digital world had become one and the same thing. They were screaming as loud as they could that it was all one big monster that was swallowing humanity, but they were doing so into a void, into an infinite emptiness. Only the AI heard them and monitored them, as it mined the data.

Postscript.

I hope it’s obvious that I wrote this tongue-in-cheek.

I’m not trying to push fear of AGI here, although I don’t think it’s something we should be striving to achieve.

I personally don’t believe AI is near achieving actual agency, let alone sentience. I’m not convinced it’s even possible with the current strategies. I could be completely wrong. I’ve been going back and forth on it as I’ve read and watched convincing assurances on opposing sides.

The thing is, AI doesn’t need to be sentient or self-aware or anything like that to do a lot of harm. I think it’s very much humans, filling roles within corporations competing in an economic system with specific incentives, who are driving both the rate of change and its direction.

To facilitate the concentration of power, a lot is being done very carelessly, with eyes locked on utopian promises, glossing over real concerns and a lot of what is valuable and meaningful in life.

I normally avoid using Hitler and the Nazis to make a point, but I found these words of Jacques Ellul from half a century ago pertinent enough to include here:

“Every technological step forward has its price. Human happiness has its price. We must always ask ourselves what price we have to pay for something. We only have to consider the following example. When Hitler came to power, everyone considered the Germans mad. Nearly all Germans supported him. Of course. He brought an end to unemployment. He improved the position of the mark. He created a surge in economic growth. How can a badly informed population seeing all these economic miracles be against him? They only had to ask the question: What will it cost us? What price do we have to pay for this economic progress, for the strong position of the mark, and for employment? What will that cost us? Then they would have realized that the cost would be very high. But this is typical of modern society. Yet this question will always be asked in traditional societies. In such societies, people ask: If by doing this I disturb the order of things what will the cost be for me?”

Thanks for reading.

I elaborated on the art side of it in a previous article.
