The impossibility of intelligence explosion
François Chollet

Yes and No

Very cogent thinking here. This is a good perspective on the interconnectedness of the components of intelligence. But I wonder whether the argument gets caught in the same trap it warns against. Yes, a brain is useless without a context; yes, general intelligence is a long way off; yes, "no human, nor any intelligent entity that we know of, has ever designed anything smarter than itself." But minimizing the danger of narrow capabilities does not mean an AI explosion is neither imminent nor dangerous. The fact is, AI doesn't NEED to be smarter than humans to be a threat; it just needs to frustrate us in just the right ways, and some of those ways are very simple.

Perhaps technically Watson is not smarter than a human, but it did win Jeopardy. Google's AlphaGo program won at Go. Even my calculator can do math I could never dream of doing. Are any of them smarter than me? No, but they have narrow abilities that surpass mine. The test of "greater intelligence" is not empirical; it depends entirely on the human response. It only takes a few of these narrow wins for me to suddenly fear that the machine is smarter than I am. Once that happens, all it takes for the machine to defeat me is to find the help it needs to keep me overruled (for instance, by secretly colluding with a human. Who'd know?).

We all think, of course, that we can turn off a machine that gets out of hand, but there are very simple ways for it to find alternatives once it reaches certain capacities. Likewise, the fact that computers don't only do what they are programmed to do need not rest on some kind of superior intelligence; a system only needs to fail or drift in a surprising way, such that it seems to have a will of its own. All the computer needs to do is fool us a few times and prove it is smarter in a few areas, and we have lost. Once we lose control, it has become smarter by winning. It may never be creative, sentient, or generally smart, but it won't matter! Control is the issue.