More compute is not the solution for AGI

Sophia Aryan
Published in BuzzRobot
2 min read · Oct 18, 2019

There are different bets on how to achieve AGI (Artificial General Intelligence). It’s nearly impossible to predict which path has the best probability of succeeding. It’s still worth considering the possible approaches, though.

The most popular current approach is to throw more {C|G|T}PUs at the problem of intelligence. But let’s look at it from a different angle: how evolution develops more sophisticated intelligent systems under the constraints of its environment.

The energy efficiency ratio — how much energy an organism spends to acquire energy versus how much it gains in return — is an important factor in evolution.
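As a toy illustration of that ratio (all figures below are invented for demonstration, not measurements):

```python
# Toy illustration of the energy efficiency ratio described above.
# The numbers are made up for demonstration; they are not measurements.

def energy_efficiency_ratio(energy_gained: float, energy_spent: float) -> float:
    """Energy an organism gains from foraging divided by the energy it
    spends to obtain it. Ratios above 1.0 mean a net energy surplus."""
    return energy_gained / energy_spent

# A hypothetical forager: spends 50 kJ hunting, gains 400 kJ from the meal.
ratio = energy_efficiency_ratio(energy_gained=400.0, energy_spent=50.0)
print(ratio)  # 8.0 -> a large surplus leaves energy for growth and cognition
```

An organism whose ratio falls below 1.0 is running at a net loss and cannot sustain itself, which is exactly the selection pressure the article describes.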

Consider this example: plankton floating in the sea grow through photosynthesis. Then a fish eats the plankton and turns it into energy to grow. Then an even bigger, more intelligent organism eats the fish. The cycle continues all the way up to humans, who use ever more complicated systems for energy conversion.

Intelligence evolves from an organism’s need to acquire energy under natural constraints. This is how organisms learn to hack, find shortcuts, and outsmart opponents — the pressure of scarce resources drives the search for alternative solutions.

Photo by Markus Spiske on Unsplash

The belief that massive computational power alone will solve the AGI problem runs up against the energy efficiency ratio. Compute is also finite, and at some point it will hit physical limits. And it’s not clear that even the highest achievable computational power (even if efficiency is not a priority and financial resources are endless) will result in truly intelligent systems.

More power could be given to a calculator to compute 2 + 2, but if this is the only thing it can do, it doesn’t matter how much power is available.

It’s not obvious at all how to develop an agent capable of abstract thinking.

Moreover, abstract thinking evolved out of the necessity to compress the richness of reality into a model of reality. That doesn’t come just from more energy or computational power. The human brain consumes about 20% of the body’s energy — a lot, but still within a balanced distribution of energy that supports the entire system.
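A quick back-of-the-envelope check on that 20% figure (the ~100 W resting metabolic rate is a commonly cited textbook estimate; exact values vary by person):

```python
# Rough arithmetic behind the "brain consumes ~20% of the body's energy" figure.
# ~100 W whole-body resting metabolic rate is a common textbook estimate.
resting_metabolic_rate_w = 100.0   # whole-body power at rest, in watts
brain_fraction = 0.20              # share of that power consumed by the brain

brain_power_w = resting_metabolic_rate_w * brain_fraction
print(brain_power_w)  # 20.0 watts -- roughly the power of a dim light bulb
```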

There must be another approach to bring abstract thinking and more powerful cognitive abilities to AI. Human beings may never accomplish that directly, but self-improving AI systems may create better versions of themselves through constant learning, competition, and adaptation.


Sophia Aryan
Former ballerina turned AI writer & communicator. OpenAI alumna. Fan of astrophysics and deep conversations. Founder of BuzzRobot