AGI Comprehension Achieved

Mounir Sita
Kimera Systems
Sep 17, 2017

Today we are announcing some exciting news (press release found here). About a week ago, Nigel AGI recorded the first scenario that it truly comprehended using our mathematical model of comprehension. Comprehension is not a typical discussion point in the field of artificial intelligence; as an industry, we are still “stuck” at the deep learning level. Let me start by explaining what we mean by intelligence and comprehension.

At AI World Conference & Expo 2016 we released our General Theory of Intelligence whitepaper. At the beginning of our research, back in 2005, we realized that it would be an almost impossible task to build thinking machines if we couldn’t even define intelligence scientifically.

Our theory is not the first theory of intelligence to be proposed. Several theories have emerged over the past century (a good overview can be found here: What are the different theories of intelligence?). However, these are, without exception, human-centric theories that cannot easily be applied to other species, whether animals or potential aliens elsewhere in the universe.

To us, that human-centric approach was unacceptable. We wanted to develop a theory of intelligence that was general in nature: something that could be applied to anyone and anything across the universe.

With those goals in mind, our research took us down the path of physics and quantum mechanics. Starting from the assumption that intelligent people are more effective at reaching their goals, we focused on the “goal” itself. Without going into the details, suffice it to say that no matter what goals you have, the only way to realize them is to change the composition of matter and energy in your little corner of the universe.

Think about that for a second: no matter what your goal is, whether it is developing a cure for cancer, hosting a dinner party, or inventing an interstellar spaceship, the one thing all of these goals have in common is that you have to create a sequence of particle movements; in other words, you have to change the composition of the universe. That is why we define intelligence as the ability to create a sequence of particle movements.

This definition is not limited to humans. It can be applied to animals, aliens, and even microbes. In fact, if you think about it, it can also be applied to particles and planets. The sun's gravity creates an (almost) never-ending sequence of particle movements that we recognize as the planets orbiting the sun. In other words, intelligence is woven into the fabric of the universe.

Of course, talking about “particle movements” may sound a little too concrete. For every change inflicted on reality, an action is needed. From an AI and human perspective, it might be easier to just say a sequence of actions.

So what is comprehension? Let me make something clear: according to the general theory of intelligence, comprehension is not binary. It is not that you either comprehend something or you don't. Comprehension is probabilistic. It is the probability that a sequence of particle movements, or actions, leads to a pre-defined goal.
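To make that concrete, here is a minimal sketch in Python. It is purely illustrative, not Nigel's actual code, and every name in it (the comprehension_factor function, the toy states and actions) is made up for the example: it simply measures how often a candidate action sequence, run from a set of starting situations, ends at the goal.

```python
from typing import Callable, Dict, List

State = Dict[str, object]

def comprehension_factor(actions: List[Callable[[State], State]],
                         goal_reached: Callable[[State], bool],
                         start_states: List[State]) -> float:
    """Probability-style score: the fraction of starting states from which
    the action sequence ends in the goal."""
    successes = 0
    for state in start_states:
        for act in actions:
            state = act(state)            # each action changes the world a little
        if goal_reached(state):
            successes += 1
    return successes / len(start_states)  # between 0 (no comprehension) and 1

# Toy usage: one action that silences the phone, and a goal of "phone is silent"
silence = lambda s: {**s, "phone": "silent"}
cf = comprehension_factor([silence],
                          lambda s: s["phone"] == "silent",
                          [{"phone": "ringing"}, {"phone": "silent"}])
print(cf)  # 1.0
```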

From an algorithmic perspective, we define comprehension for Nigel even more concretely. As part of the knowledge entanglement process, we consider comprehension to be the ability to abstract memorized knowledge, using “interpretation nodes”, into concepts, and then to use those concepts to create a sequence with a high comprehension factor (CF).
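The snippet below is a rough, hypothetical illustration of that idea; the dictionaries and function names are ours, invented for this post, not Nigel's internals. A domain-specific observation is abstracted through an interpretation node into a concept, and the action sequences attached to that concept are ranked by their CF.

```python
# Hypothetical illustration: observation -> interpretation node -> concept,
# then pick the action sequence with the highest comprehension factor (CF).

# Domain-specific observation mapped to an abstract concept
interpretation_nodes = {"movie_theater": "quiet_shared_space"}

# Concept mapped to candidate action sequences, each with an estimated CF
concept_sequences = {
    "quiet_shared_space": [
        (("silence_phone",), 0.9),   # high CF: usually reaches the goal
        (("do_nothing",), 0.1),      # low CF: rarely reaches the goal
    ],
}

def comprehend(observation):
    """Abstract the observation to a concept and return the highest-CF sequence."""
    concept = interpretation_nodes.get(observation)
    if concept is None:
        return None                  # no interpretation node yet: not comprehended
    return max(concept_sequences[concept], key=lambda pair: pair[1])

print(comprehend("movie_theater"))   # (('silence_phone',), 0.9)
```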

Figure 1 — Kylee Model

Last August, just days after we launched our private beta, we announced that Nigel had learned its first piece of common sense. It took Nigel a handful of users and a few days to memorize and learn that it is common sense to silence your phone at the cinema.

At this stage, Nigel had developed a memorized understanding from disparate sensor observations. That understanding was limited to movie theaters.

Last week, Nigel was able to abstract that memorization into a concept using just the interpretation nodes. While the immediate result is the same, silencing phones at movie theaters, the long-term impact is more interesting. By abstracting the knowledge of keeping quiet away from the movie-theater domain, Nigel can now learn how to use this knowledge in other domains, for example at libraries or in class at school.
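Continuing the hypothetical sketch from above, the pay-off of that abstraction is that a new domain only needs a new interpretation node pointing at the existing concept; the learned action sequence is reused rather than re-learned.

```python
# Continuing the hypothetical sketch above: a new domain just maps onto the
# existing concept, so the "silence your phone" knowledge carries over.
interpretation_nodes["library"] = "quiet_shared_space"
interpretation_nodes["classroom"] = "quiet_shared_space"

print(comprehend("library"))    # (('silence_phone',), 0.9), same knowledge, new domain
```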

Abstracting memorized, domain-specific knowledge into domain-independent knowledge is key to general AI. That is what we proved we were capable of doing last week.

Please join our beta: https://play.google.com/store/apps/details?id=ai.kimera.nigel

Also, follow us on YouTube to learn more about our approach to AGI and its business: https://www.youtube.com/channel/UCEvuaO0z1kqnxUIc5InHqFw
