How the computer model holds back AI

since the “information” model is alien to brains

John Ball
Pat Inc
10 min read · Jan 6, 2025


This image was generated from the text prompt “computer holding back robot.” Here, the robot peers into the computer, which is stopping its progress because of the way it works. Or is the AI tool in error, merely holding it up rather than holding it back?

Last time, I identified the problem. I wrote:

AI is being held up by standard ideas used in computer science that are ineffective for human brain emulation. The main problem is that the representation used in computers is alien to that of a brain but is forced in nonetheless. We need to move beyond 1950s data models! … But AI evolved from digital computer representations; the representations that ushered in the information age.

Let’s move the discussion forward by comparing how a digital computer represents information with how a brain does according to Patom brain theory.

As usual, I will use hotlinks to online references for further reading.

Representation: 0 and 1 (processing model?)

We represent binary digits with ‘0’ and ‘1.’ Using base 2, we can string binary digits together to form larger numbers; for example, hexadecimal ‘F’ is binary 1111 (decimal 15). This approach is effective for storing numbers and characters and for replicating the skills of human computers from the days before digital machines, but digital computers emulate the work those human computers did, not their sensory and motor systems.
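To make that concrete, here is a minimal sketch in Python (the article itself contains no code, so this is purely illustrative) showing that the same bit pattern can be read as binary, hexadecimal, or decimal, and that characters are ultimately stored as numbers too:

```python
# A minimal sketch (not from the article): one bit pattern, several notations.
value = 0b1111           # four binary 1s
assert value == 0xF      # the same number written as hexadecimal F
assert value == 15       # ... or as decimal 15

# Stringing binary digits together builds larger numbers in base 2:
bits = "101101"          # 1*32 + 0*16 + 1*8 + 1*4 + 0*2 + 1*1
assert int(bits, 2) == 45

# Characters are stored the same way, as numbers (code points):
assert ord("A") == 65 == 0b1000001

print(f"{value:04b} = 0x{value:X} = {value}")   # prints: 1111 = 0xF = 15
```

This is exactly the kind of representation a digital computer is built around: strings of bits standing for numbers and characters, not for sensory or motor patterns.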
