At the tail end of 2017, a feature in Wired offered a glimpse into a new “church of artificial intelligence,” set up by Silicon Valley engineer and expert in self-driving car technology, Anthony Levandowski. The aim of Levandowski’s church — called the Way of the Future — is described in papers filed with the U.S. Internal Revenue Service as “the realization, acceptance, and worship of a Godhead based on Artificial Intelligence (AI) developed through computer hardware and software.”

“It’s not a god in the sense that it makes lightning or causes hurricanes,” Levandowski told Wired. “But if there is something a billion times smarter than the smartest human, what else are you going to call it?”

These words, as wobbly as they are, are painted against a backdrop of high-profile advances in artificial intelligence and machine learning. Speech and facial recognition have continued to find their way into our homes, pockets, and state surveillance. Neural networks are bringing superhuman levels of analysis to everything from security to finance. Last year, DeepMind’s AlphaGo Zero taught itself to play the thousand-year-old game of Go in three days. Does the self-professed dean of the Way of the Future have a point?

“No,” says Luciano Floridi, professor of philosophy and ethics of information and director of the Digital Ethics Lab at the University of Oxford. “This is just an old confusion mixed with a new mistake.”

“The old confusion is in the comparison: The sun is a billion times more powerful than humans, but that does not make it a god. The mistake is in stating that AI is smarter than humans. In any serious sense of ‘smart,’ this is meaningless. AI is immensely more powerful computationally. But this, like in the sun’s case, does not make it any more divine than a kettle.”

The “God Language” of American Desires

DeepMind’s AlphaGo may be no more of a deity than a kitchen appliance, but the religious language around artificial intelligence, and the technology industry in general, ripples wider than one engineer’s church.

Game of Go. Photo: Wikimedia Commons

The quasi-religious techno-utopianism of “the singularity,” for example — the belief that an AI will one day dominate society and alter life into forms we can’t yet comprehend — has more than a few shakes of rapture-like rhetoric. One of the singularity’s keenest proponents, futurist Raymond Kurzweil, has spoken in the past about how he plans to “overcome our genetic disposition” and resurrect his dead father using AI.

On the other side of the apocalyptic divide, religious and supernatural imagery is regularly evoked when speaking about the threat of advanced artificial intelligence.

Elon Musk likened the development of artificial intelligence to “summoning the demon” at a 2014 MIT conference: “In all those stories where there’s the guy with the pentagram and the holy water, it’s like, yeah, he’s sure he can control the demon. Doesn’t work out.”

“Part of the problem is asking what we mean by terms like religion and god,” says Timothy Carroll, a research fellow in anthropology at UCL. “In a classic sociological analysis of who/what god is, Émile Durkheim said that ‘God is society, writ large.’”

From this sort of position, god — whatever its form — is a projection of what society is: its hopes, dreams, values, and fears, projected into the sky like the Batman signal.

“In an age of technological aspiration, especially in a context of rapid innovations such as those coming from Silicon Valley, it makes sense to see the formation of a deity out of these hopes and dreams.”

Apple store in Kunming, China. Photo: Eckersley O’Callaghan, IDA, Hufton + Crow

Silicon Valley certainly has precedent in casting its culture of “technological aspiration” within the prose of faith, from its fondness for job titles such as “evangelist” to the well-reported cultish trappings of brands like Apple.

Carroll notes that the United States has a long history of overt public religious fervor, pointing to the theory developed by sociologist Robert Bellah about an “American civil religion” that emphasizes nondenominational religious themes as part of its national identity. “‘God-language’ is readily available to the hearts and minds of even the self-professed nonreligious,” says Carroll.

Circle of angels from Dante’s ‘Paradiso,’ by Gustave Doré.

But is the technology industry exploiting this deeply embedded language? If god is a projection of a society’s hopes and dreams, then are these sentiments already being hijacked by the type of techno-utopianism spouted by Facebook, Google, Amazon et al., with or without a 501(c)(3) tax status?

“Digital technology, AI included, has appropriated the discourse about hope,” warns Floridi. “The hope of higher productivity, easier interactions, faster connections, better products, more social contacts…the rhetoric changes, but the fundamental promise is that a new digital technology will be better than an old one in fulfilling more promises.

“The risk is that digital techno-hope may manipulate and exploit people, replace any other kind of hope, including more spiritual ones, and end up supporting some superstitious view.”

Icons and Fetishes

Despite the tech industry’s tendency to reframe society’s hopes and dreams with quasi-religious fervor, it’s worth noting that the veneration of man-made objects is by no means a new phenomenon. Viewed in the context of human civilization, deifying AI systems may be little more than a modern spin on idol worship: an algorithmic golden calf.

Nkisi Nkondi, Museum Expedition 1922, Robert B. Woodward Memorial Fund.

The meeting place between icon and intelligence is something Carroll compares to the Congolese nkisi fetish figures. “Looking at how this is made, it is a mix of flora (a tree), fauna (a chicken), human (the spirit of a killed hunter), the technical expertise of the priest, and the votive oath of the propitiate. Together, taking bits and pieces from different parts of society and the ecology, the nkisi becomes an agent in its own right and is able to kill, or seek revenge, etc. It is an intelligence, and a man-made one.”

This comparison very much depends, however, on whether you seek to think about AI as a man-made object or as an intelligent network. The 20th-century anthropologist Gregory Bateson, for example, conceived of “mind” as a supreme cybernetic system that encompasses individuals, societies, and ecosystems, not something contained in an individual person. This overarching mind is, Bateson argued, what some people call “god” — although he wasn’t fond of the term. Artificial intelligence, like Bateson’s theory, has grown out of cybernetics, and there are echoes of this thinking in everything from cloud computing to social networks.

But the leap between this conception and that of an omnipotent, omnipresent artificial superintelligence is the philosophical equivalent of running into a Road Runner–esque fake tunnel. AI systems may be able to learn board games and could be a great help to scientific research, but they are relatively useless at tasks that can’t be perfectly simulated on a computer. Even a video game like StarCraft is currently beyond the grasp of AI bots, let alone holistic decisions about the future direction of the human race.

Silicon Valley may be adept at leveraging the language of religion, but — to echo Floridi — there’s no more god in a neural network than in a kettle.