Creating new neuron model for artificial neural networks

Orlovsky Maxim
BICA Labs
Apr 3, 2018

Since the introduction of the McCulloch-Pitts neuron in 1943 and Hebb's learning principle in 1949, little progress has been made in developing artificial neural networks (ANNs) with properties resembling the neurophysiology of living neural tissue. Various ANN variants, like Hopfield networks, and machine-learning approaches, including Boltzmann machines, Bayesian belief networks, Kohonen self-organizing maps, etc., are based purely on mathematical and statistical models and have much less in common with neurophysiological brain properties than the feed-forward or recurrent ANNs built on the work of McCulloch-Pitts and Hebb.
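To ground the two classical building blocks mentioned above, here is a minimal illustrative sketch (not from the article): a McCulloch-Pitts threshold neuron and a plain Hebbian weight update; the function names and the learning rate are my own choices for illustration.

```python
def mcculloch_pitts(inputs, weights, threshold):
    """McCulloch-Pitts neuron: fire (1) if the weighted sum of binary inputs reaches the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

def hebbian_update(weights, inputs, output, lr=0.1):
    """Hebb's principle: strengthen each weight where pre- and post-synaptic activity coincide."""
    return [w + lr * x * output for w, x in zip(weights, inputs)]

# With equal weights and threshold 2 the neuron behaves like logical AND:
out = mcculloch_pitts([1, 1], [1.0, 1.0], threshold=2.0)  # fires: 1
new_w = hebbian_update([1.0, 1.0], [1, 1], out)           # coincident activity grows both weights
```

Note that the Hebbian update uses only quantities local to the neuron (its inputs and its own output), which is exactly the property that later "local learning" schemes build on.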

Recent progress in deep neural networks has drawn inspiration from the structure of the brain's visual cortex and has proven to be a huge step forward in complex tasks like computer vision and natural language processing; however, it mimics only large-scale brain architecture (such as multiple neural layers with convolution) and does not address the issue of better representing neuron- and synapse-level properties in the artificial neuron and learning models themselves.

At BICA Labs we are working on a new ANN paradigm that combines properties of feed-forward and recurrent ANNs and has an embedded learning rule (named "local learning") with the potential to be more efficient than classical back-propagation in a number of ways. For instance, it can be resistant to the local-minima problem, the vanishing-gradient problem, and the significant volume of computation required by back-propagation algorithms. The combination of feed-forward and recurrent properties provides a basis for future architectures that would be able to natively form high-order abstractions and symbols inside the ANN itself; the proposed approach might give an advantage on the route to general (human-level) artificial intelligence.
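The post does not specify its "local learning" rule, so as a generic illustration of what a backprop-free, local update looks like, here is Oja's rule, a well-known local learning rule: each weight changes using only signals available at that synapse (its input and the neuron's own output), with no backward pass of gradients.

```python
import random

def oja_step(w, x, lr=0.05):
    """One step of Oja's rule: a local, normalizing variant of Hebbian learning."""
    y = sum(wi * xi for wi, xi in zip(w, x))  # post-synaptic activity
    # Hebbian growth (y * xi) balanced by a local decay term (y^2 * wi)
    return [wi + lr * y * (xi - y * wi) for wi, xi in zip(w, x)]

random.seed(0)
w = [0.5, 0.5]
for _ in range(500):
    s = random.gauss(0, 1)
    x = [s, 0.1 * random.gauss(0, 1)]  # variance concentrated along axis 0
    w = oja_step(w, x)
# w drifts toward the leading principal direction of the data, roughly [1, 0]
```

This is only an existing, classical example of the *family* of local rules; it is not the rule developed at BICA Labs.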

Our research in this direction started a year ago, and since then original papers in the field have begun to appear: https://www.nature.com/articles/s41598-018-23471-7

This suggests that the direction is very promising.

BICA Labs is looking for part-time researchers who would like to join the team working on this project. We are looking for the following qualifications:

– math & analysis (an understanding of the mathematics behind modern ANNs)

– the ability to code and test new types of ANNs without relying on TensorFlow or other frameworks

Please contact us at https://m.me/bicalabs

#AI #Hebbian #GradientDescent #StrongAI #ANN #MachineLearning #NewFrontiers
