How Brains Understand Language: Part 1 of 2.

John Ball
Published in Pat Inc
Oct 14, 2021 · 12 min read


Computers have revolutionized the world with continuous improvement since the 1950s, but they haven't worked well on biologically based problems like vision, animal-like movement control, and conversing in human languages.

Figure 1. Brains (foreground) are not computers (background circuit), which is why each is so good at its own tasks and so poor at the other's. What's different? Image: Adobe Stock.

That's a shame, because who wouldn't want to ask Siri or Alexa a question and get the kind of response a human speaker of your language would give? And how would the world change if we could ask a question and get current information from the internet, even when its source was written in another language?

Siri and Alexa have been seeding the possibilities of natural language interaction, whether by voice or text, since 2011. But we are stuck with simple commands and dumbed-down requests that fit the machine's programmed 'intents'. Something basic is missing.
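To make "programmed intents" concrete, here is a minimal sketch of the pattern behind intent-based assistants: the developer enumerates intents with trigger phrases, and anything outside that list falls through to a fallback. The intent names and phrases are hypothetical, not Siri's or Alexa's actual implementation.

```python
# Minimal sketch of intent matching, the pattern behind most voice
# assistants. Intent names and trigger phrases are hypothetical.

INTENTS = {
    "set_timer":  ["set a timer", "start a timer", "timer for"],
    "play_music": ["play some music", "put on some music"],
    "weather":    ["weather", "will it rain", "forecast"],
}

def match_intent(utterance: str) -> str:
    """Return the first intent whose trigger phrase appears in the utterance."""
    text = utterance.lower()
    for intent, phrases in INTENTS.items():
        if any(phrase in text for phrase in phrases):
            return intent
    # Anything the developer didn't anticipate lands here: the system
    # has no model of meaning, only a lookup over programmed phrases.
    return "fallback"

print(match_intent("Set a timer for ten minutes"))        # set_timer
print(match_intent("Will it rain tomorrow?"))             # weather
print(match_intent("Why did the chicken cross the road?"))  # fallback
```

Real systems layer statistical classifiers and slot filling on top of this, but the basic contract is the same: if an utterance doesn't map to a programmed intent, understanding stops.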

The best way to fix this is to emulate what a brain does to learn language, and apply it at scale.

Today's technology cannot track brain function at the cellular level in real time, which would help us reverse-engineer a human brain, but the scientific method should be adequate: observation, hypothesis, and experimentation have worked well on other problems. In astronomy, for example, the scientific method broke a 1,500-year fixation on the geocentric model, which could not explain the observations.


John Ball · Pat Inc

I'm a cognitive scientist working on NLU (Natural Language Understanding) systems based on RRG (Role and Reference Grammar). A mouthful, I know!