Pigeonholing Artificial Intelligence

By Siddharth Singh, 21st July 2015

The definitions of artificial intelligence (AI) are replete with references to human intelligence. Here are the first few definitions that show up in a search: “Artificial intelligence is the branch of computer science concerned with making computers behave like humans”; “…the study of the modelling of human mental functions by computer programs”; and “an area of computer science that deals with giving machines the ability to seem like they have human intelligence”.

Cory Doctorow, writing in Locus Online, prods us to think beyond such a narrow conception of intelligence. He writes,

“We don’t need artificial intelligences that think like us, after all. We have a lot of human cognition lying around, going spare — so much that we have to create listicles and other cognitive busy-work to absorb it. An AI that thinks like a human is a redundant vanity project — a thinking version of the ornithopter, a useless mechanical novelty that flies like a bird.

We need machines that don’t fly like birds. We need AI that thinks unlike humans. For example, we need AIs that can be vigilant for bomb-parts on airport X-rays. Humans literally can’t do this. If you spend all day looking for bomb-parts but finding water bottles, your brain will rewire your neurons to look for water bottles. You can’t get good at something you never do.”

He writes this in the context of our fears of AI. See a previous blog post on the issue.
