Back in February 2015 I wrote down some random thoughts on our current approaches to Artificial Intelligence and Machine Learning, and based on Google's announcements at I/O 2017 yesterday, I observe that some of my points remain valid and are being empirically borne out.
What follows is the full essay, dated Feb 22, 2015 and originally posted on my personal Facebook timeline:
You see, the problem with current artificial intelligence approaches is that they focus too much on the software, while in my view the problem lies in the hardware architecture.
Yes, our brains are like computers, and our own human automata do rely on electrical currents and pulses to work. But unlike the conventional computers and server processors we use today, we do not store data in binary form, nor do we have binary registers.
The more I study the human brain in order to automate it, the more I observe that our electrical synapses act as transistors most of the time. Most of the brain's data-flow secrets, however, reside in the chemical synapses, which are far more complex than any of the low-level components that make up today's CPUs. They are arguably more complex even than quantum processors, since a chemical synapse can express more than one qubit's worth of state once you consider all the possible combinations of neurotransmitters that the presynaptic neuron can release through its calcium channels.
So, the secret lies in the machine, not in the software. Perhaps we need to rethink the whole approach to AI and build a special-purpose computer, like Bitcoin mining ASICs but with far more complex low-level components. If the intention is to truly build a human brain-like device, we should focus on brain mechanisms rather than trying to simulate them on binary machines (using software perceptrons).
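For context, the "software perceptron" I mean here is the basic building block of today's artificial neural networks: a weighted sum pushed through a hard threshold, running on a binary machine. A minimal sketch in Python (the weights and the AND-gate example are my own illustrative choices, not anything from a specific library):

```python
# A minimal software perceptron: the binary-machine abstraction of a
# neuron that, I argue, falls far short of a real chemical synapse.
def perceptron(inputs, weights, bias):
    # Weighted sum of inputs followed by a hard threshold (step activation).
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if activation > 0 else 0

# Example: hand-picked weights make the perceptron compute logical AND.
def and_gate(a, b):
    return perceptron([a, b], [1.0, 1.0], -1.5)

print(and_gate(1, 1))  # fires
print(and_gate(1, 0))  # does not fire
```

Everything here reduces to a handful of multiplications and one comparison, which is exactly the kind of simulation-on-binary-hardware the paragraph above questions.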
I think that by taking a more analog approach to the subject, we could build a machine much more similar to the actual human brain.
We should take neural coding theories into consideration and translate them into machine automata. The human brain operates on continuous, on-demand, region-specific variable frequencies and voltages, not on the binary switching and simple logic gates of traditional computers. Information storage in the human brain is also far more complex than what we use in computers today: learning changes the processing unit as a whole rather than just performing store / read / write operations. It is as if the firmware of the whole system rewires and reprograms itself every time you learn or experience something new.
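One way to picture those "continuous variable frequencies" is the classic leaky integrate-and-fire model from computational neuroscience, where a neuron encodes an analog input as an output spike rate rather than a binary value. A rough sketch, with all parameter values chosen purely for illustration (not biologically calibrated):

```python
def lif_spike_count(input_current, steps=1000, dt=1.0,
                    tau=20.0, threshold=1.0):
    """Leaky integrate-and-fire neuron (illustrative parameters).

    The membrane potential v leaks toward rest while integrating the
    input current; when v crosses the threshold, the neuron emits a
    spike and v resets. Returns the number of spikes over the run.
    """
    v = 0.0
    spikes = 0
    for _ in range(steps):
        # Leak term pulls v back toward rest; input term drives it up.
        v += dt * (-v / tau + input_current)
        if v >= threshold:
            spikes += 1
            v = 0.0  # reset after firing
    return spikes

# A stronger input produces a higher firing frequency: an analog
# quantity is encoded as a spike rate, not as a stored binary word.
print(lif_spike_count(0.06), lif_spike_count(0.15))
```

Note how a weak input (one whose steady-state potential stays below threshold) produces no spikes at all, while stronger inputs fire faster and faster: the "value" lives in the timing of the dynamics, not in any register.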
Those are just some random late-night thoughts on AI… and perhaps on how stupidly we are approaching the problem today…