Are we getting distracted by board-game-winning supercomputers?

(Picture by the Opte Project [CC BY 2.5 (http://creativecommons.org/licenses/by/2.5)], via Wikimedia Commons)

It's the humans behind artificial intelligence, not the machines, that we should focus on now

Elon Musk recently called AI our “biggest existential threat”. Almost every week there is more news about robots winning one game or another. Now, it seems, they have even mastered Go, which was seen as one of the last bastions of human superiority (to the great displeasure of my little brothers, who quite liked claiming to play a game too hard even for supercomputers). So should we all prepare for the coming rule of the robots?

We could, but in the meantime there might be something even more urgent. Apple’s CEO Tim Cook just gave a vivid account of the powers he holds today, not in some near or remote future: building a backdoor into the iPhone would be “the equivalent of a master key [in the physical world], capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes.”

Combine such a data dump with Palantir’s software, which helps connect the dots, spot relationships and pick out key pieces of information under the guidance of a human brain, and someone becomes pretty powerful. The US military already shows what is possible: its analysts combine human intuition with pattern-recognition techniques to identify suspects from afar and execute targeted drone killings all over the world while sitting in their US offices. The 2015 killing of Nasir al-Wuhayshi, al-Qaeda’s second-in-command, offered a high-profile example of a so-called signature strike. Artificial intelligence, in contrast, cannot yet pass an eighth-grade science test.

Hence, while autonomous intelligence seems an important concern for the future, perhaps the most imminent issue right now is dealing with the humans who control large swaths of data and intelligent machines.
The most obvious way would be to introduce clear legislation and guiding principles, settling issues like the current standoff between Apple and the FBI before they arise.

But as Snowden showed, governments and large corporations are not necessarily self-limiting. People have to keep the guardians of their data and algorithms in check, and currently the balance of power is out of whack. It seems a bit like the period after the advent of literacy, when those able to read ruled as removed philosopher kings. Only when reading and writing became widespread did democracy develop and the majority of people become involved in decision making.

Equally, Silicon Valley’s data- and algorithm-literate avant-garde can currently afford to live in a remote commune in Palo Alto and make far-reaching decisions without the rest of the population having much of a say.
This will only change when more people actually understand what is happening under the hood. Obama’s plan to provide coding lessons is a first step. Yes, there may be debate about how far his proposed $4.1bn will go, given that it amounts to no more than about $86 per child. But it marks a fundamental change in public policy, taking computing skills out of the geek corner and into the realm of basic education, relevant to everyone.
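For readers who want to sanity-check that per-child figure, here is a rough back-of-the-envelope sketch. The student count of roughly 48 million is an assumption for illustration, not a number from the article or the official proposal.

```python
# Back-of-the-envelope check of the ~$86-per-child figure.
# The ~48 million K-12 student count is an assumption for illustration only.
proposed_budget = 4.1e9      # Obama's proposed $4.1bn for computer science education
assumed_students = 48e6      # assumed number of US K-12 students reached

per_child = proposed_budget / assumed_students
print(f"Roughly ${per_child:.0f} per child")  # ~$85, in line with the ~$86 cited above
```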

And spreading computer literacy far and wide might not only help tackle the issue of almighty technologists. It could also go a long way toward turning the existential threat of autonomous intelligence into the much more positive vision of OpenAI, a non-profit backed by Musk, which believes that “AI should be an extension of individual human wills and, in the spirit of liberty, as broadly and evenly distributed as possible.”
