Systematic Thinking is Not General Intelligence

Photo by Alonso Reyes on Unsplash

Systematic thinking and general intelligence are two different things. We build models of general intelligence that employ systematic thinking. However, this should not imply that general intelligence is built out of systematic thinking.

Logic and probability are examples of systematic thinking. The naive approach to creating models of general intelligence is to assume that they need to be constructed of systematic methods.

So there is this tunnel vision among GOFAI folks and Bayesians: the assumption that their epistemological tools are the same tools that biological brains employ. This is clearly misguided.

It is as misguided as claiming that the wheel is the most efficient mechanism for locomotion and that biology must therefore employ wheels in its mechanisms. I can see why people might have believed this metaphor in the past, but we know enough biology to know it is absurd.

Science involves systematic thought processes. The great advance that science introduces is that truth is discovered through experimental evidence and not from human authority. (Anti-science advocates are unaware of this distinction.)

Science is truth discovered via a decentralized process.

Does general intelligence itself employ a systematic process? Perhaps it does in the sense that it has evolved to use one that is robust for its survival. But that does not imply it employs the same systematic processes that humans have invented.

The process of evolution is a systematic process in its own way. It is, however, not one that humans invented. It existed before humans did.

Humans cannot follow an evolutionary process. At best, we can employ logic or probabilistic thinking in our heads. To compensate, we use computers, with their massive computational capabilities, to run an evolutionary process.

The advances in Deep Learning are a consequence of running an evolutionary process. There are attempts, though, to explain the process in Bayesian terms, but explanations are not the same as the process itself.
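To make the metaphor concrete, here is a minimal sketch of what "running an evolutionary process" on a computer can look like: a toy (1+λ) evolution strategy that mutates candidate parameters and keeps whichever variant scores best. The fitness function, the target values, and all the names here are illustrative assumptions, not a description of any particular Deep Learning system.

```python
import random

# Illustrative target: the parameters the process should discover.
# A real system would instead score a network against data.
TARGET = [0.2, -1.5, 3.0]

def fitness(candidate):
    # Negative squared error: higher is better.
    return -sum((c - t) ** 2 for c, t in zip(candidate, TARGET))

def mutate(candidate, scale=0.1):
    # Variation: small random perturbation of every parameter.
    return [c + random.gauss(0, scale) for c in candidate]

def evolve(generations=2000, offspring=8):
    # Start from a random guess, then iterate variation + selection.
    best = [random.uniform(-5, 5) for _ in TARGET]
    for _ in range(generations):
        children = [mutate(best) for _ in range(offspring)]
        # Selection: keep the fittest of parent and offspring.
        best = max(children + [best], key=fitness)
    return best

if __name__ == "__main__":
    solution = evolve()
    print("best candidate:", [round(x, 3) for x in solution])
```

The point of the sketch is only that the machine grinds through blind variation and selection at a scale no human could follow in their head; the human contribution is the setup, not the search itself.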

Explanations are interpretations for models. They are not the models themselves. Explanations are what we humans use to explain the world in a way that is intuitive to us. It is a kind of coarse-graining of information.

Reasoning is the explanations that we conjure up to give us an intuitive understanding of what we observe.

An ‘intuitive understanding’ is driven by the conceptual model of reality that we have adopted. So the Greeks might explain a flood as Poseidon being angry. This reasoning will be less accurate than that of someone trained in meteorology.

The odd thing is that we think of reasoning as a deductive process, so we may be inclined to assume, as GOFAI has assumed, that reasoning emanates from deductive sub-components. This is a mistake. (see: How Reasoning Emerges from Intuition)

Intuition, or amortized inference, is an emergent property of an evolutionary process. Furthermore, reasoning is an emergent property of an intuitive process. In fact, 100 years ago, C.S. Peirce sketched out the evolution of inference: intuition → deduction → abduction.

It is indeed odd that, after 100 years, we still do not have good models of abduction, as if it were an entirely alien idea. But it is right there before our very eyes: the systematic thought process that we call science is an abductive process.
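To make the directional difference concrete, here is a minimal sketch built on Peirce's classic bean example, assuming a single toy rule; the predicates and data structures are purely illustrative. Deduction derives a result from a rule and a case; abduction runs the other way, hypothesizing a case that would explain an observed result if the rule holds.

```python
# Peirce's bean example, schematically:
#   Rule:   all beans from this bag are white.
#   Case:   these beans are from this bag.
#   Result: these beans are white.

# Toy rule base: from_bag(bean) -> white(bean)
RULES = {("from_bag", "bean"): ("white", "bean")}

def deduce(case):
    """Deduction: given a case, apply the rule to derive the result (certain)."""
    return RULES.get(case)

def abduce(result):
    """Abduction: given an observed result, hypothesize a case that would
    explain it under the known rule. The hypothesis is plausible, not certain."""
    return [case for case, derived in RULES.items() if derived == result]

print(deduce(("from_bag", "bean")))   # ('white', 'bean')      -- follows necessarily
print(abduce(("white", "bean")))      # [('from_bag', 'bean')] -- a candidate explanation
```

Even this toy version shows why abduction is harder to model: the deductive direction has one answer, while the abductive direction returns candidate explanations that still have to be tested, which is exactly what experimental science does.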
