Written by Christy Maver (Numenta VP of Marketing) and Dr. Michael Riendeau

We recently started a video series of conversations with leading intellectuals on the topics outlined in A Thousand Brains by Jeff Hawkins. This series aims to extend the conversations that the book has started on the implications and applications of Jeff’s novel theory of how our brain works.

How can we apply the Thousand Brains Theory to pedagogy?

To kick off this series, we were joined by Dr. Michael Riendeau and his two students, Ranger Fair and Jacob Shalmi from Eagle Hill School. …


Written by Charmaine Lai, Numenta Marketing Associate

Geoffrey Hinton recently published a paper, “How to Represent Part-Whole Hierarchies in a Neural Network,” and presented a new theory called GLOM.

We’ve been getting a lot of questions lately about the differences between Hinton’s GLOM model and Numenta’s Thousand Brains Theory. In this blog, I will outline the commonalities and main differences between the two models at a high level. Those who want more details can watch the presentation by our researcher Marcus Lewis, in which he discusses the GLOM model through the lens of the Thousand Brains Theory.

What is the Thousand Brains Theory of Intelligence?

The Thousand…


Written by Karan Grewal, Numenta Research Staff Member

Early in life, humans first learn to walk; a few years later, they learn to ride bicycles; and as young adults, they learn to drive cars. In learning each new task, humans don’t forget previous ones. Artificial neural networks, on the other hand, struggle to learn continually and consequently suffer from catastrophic forgetting: the tendency to lose almost all information about a previously learned task when attempting to learn a new one.
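To make the phenomenon concrete, here is a minimal sketch (not from the original post; it uses synthetic data and an illustrative toy network) of a model trained on one task and then another, with no further access to the first task’s data:

```python
# Illustrative sketch of catastrophic forgetting with synthetic tasks.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(n=2000, d=20):
    """A synthetic binary classification task with its own random decision rule."""
    x = torch.randn(n, d)
    w = torch.randn(d)
    y = (x @ w > 0).long()
    return x, y

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

def train(model, x, y, epochs=200):
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))

xa, ya = make_task()   # task A
xb, yb = make_task()   # task B (a different random rule)

train(model, xa, ya)
print("task A accuracy after learning A:", accuracy(model, xa, ya))

train(model, xb, yb)   # learn task B with no access to task A data
print("task A accuracy after learning B:", accuracy(model, xa, ya))
print("task B accuracy after learning B:", accuracy(model, xb, yb))
```

Running this, the network’s accuracy on task A typically collapses once task B has been learned, even though task B itself is learned well.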

In this post, I will describe the technicalities of why neural networks do…


Written by Charmaine Lai (Numenta Marketing Associate) and Niels Leadholm

If you’ve watched our research meetings over the past few weeks, you may have seen a fresh face or heard a new voice in our videos. They belong to Niels Leadholm, a PhD student who spent 12 weeks with Numenta as a visiting research scholar. I asked Niels, one of Numenta’s first “virtual” interns, to share his work and experience interning at Numenta.

Q1: Hi Niels, can you tell us a bit about yourself and your area of research and expertise?

Sure! I’m a PhD student at the Oxford Lab for Theoretical Neuroscience and Artificial Intelligence. My interest is in understanding primate vision…


Written by Charmaine Lai, Numenta Marketing Associate

AlphaGo, a computer Go program developed by Google DeepMind, may have beaten 18-time world champion Lee Sedol at Go, but Lee’s brain ran on a mere 20 watts, less power than a lightbulb. AlphaGo, in contrast, ran on 1,920 CPUs and 280 GPUs, consuming roughly 50,000 times as much power as Lee’s brain.

Deep learning networks are hitting bottlenecks when they scale to more complex tasks and bigger models. Many of these models require enormous amounts of power, raising sustainability issues and creating environmental threats. …
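As a quick sanity check on those figures (this snippet is ours, not from the post), the quoted ratio implies that AlphaGo’s hardware drew power on the order of a megawatt:

```python
# Unit arithmetic on the figures quoted above (illustrative only).
brain_watts = 20          # rough estimate for a human brain
ratio = 50_000            # power ratio cited for AlphaGo vs. Lee Sedol
alphago_watts = brain_watts * ratio
print(f"Implied AlphaGo draw: {alphago_watts:,} W (~{alphago_watts / 1e6:.0f} MW)")
# -> Implied AlphaGo draw: 1,000,000 W (~1 MW)
```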


Written by Lucas Souza, Numenta Research Staff Member

About a year ago, in the post The Case for Sparsity in Neural Networks, Part 1: Pruning, we discussed the advent of sparse neural networks and the paradigm shift they signal: models can also learn by exploring the space of possible topologies in a sparse network. We showed that combining gradient descent training with an optimal sparse topology can lead to state-of-the-art results with smaller networks. …
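For readers new to the idea, here is a minimal sketch of magnitude pruning, assuming a PyTorch layer and an illustrative 90% sparsity target; it is not the implementation from the original post:

```python
# Illustrative magnitude pruning: zero out the smallest weights of a trained
# layer, then keep that sparse topology fixed with a mask during further training.
import torch
import torch.nn as nn

torch.manual_seed(0)
layer = nn.Linear(128, 64)   # assume this layer has already been trained densely

sparsity = 0.9               # fraction of weights to remove (chosen for illustration)
with torch.no_grad():
    magnitudes = layer.weight.abs().flatten()
    k = int(sparsity * magnitudes.numel())
    threshold = magnitudes.kthvalue(k).values        # k-th smallest magnitude
    mask = (layer.weight.abs() > threshold).float()  # 1 = keep, 0 = prune
    layer.weight.mul_(mask)                          # zero out the pruned weights

def apply_mask():
    """Re-apply the mask (e.g. after each optimizer step) so pruned weights stay zero."""
    with torch.no_grad():
        layer.weight.mul_(mask)

print(f"nonzero weights: {int(mask.sum())} / {mask.numel()}")
```

In this simple scheme, gradient descent continues to update only the surviving connections, which is the basic mechanism behind training sparse networks with a fixed topology.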


Written by Vincenzo Lomonaco, Numenta Visiting Research Scientist

My name is Vincenzo Lomonaco and I’m a Postdoctoral Researcher at the University of Bologna, where, in early 2019, I obtained my PhD in computer science working on “Continual Learning with Deep Architectures” in an effort to make current AI systems more autonomous and adaptive. Personally, I’ve always been fascinated by the research insights coming out of the 15+ years of Numenta research at the intersection of biological and machine intelligence. …


Written by Lucas Souza, Numenta Research Staff Member

Last week at Numenta we held our monthly Brains@Bay meetup, gathering data scientists and researchers in the Bay Area to talk about sparsity in the brain and in neural networks (recordings available here). Sparsity is also a topic we’ve been discussing extensively in our research meetings and journal clubs over the past few weeks.

Sparsity has long been a foundational principle of our neuroscience research, as it is one of the key observations about the neocortex: everywhere you look in the brain, the activity of neurons is always sparse. Now as we work…
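As an illustration of what sparse activity can look like in an artificial network, here is a toy k-winners-take-all step (an illustrative sketch, not Numenta’s implementation), in which only the k most active units of a layer’s output remain on:

```python
# Illustrative sparse activations via a k-winners-take-all step.
import torch

def k_winners(x: torch.Tensor, k: int) -> torch.Tensor:
    """Keep the k largest activations in each row of x; zero the rest."""
    topk = torch.topk(x, k, dim=1)
    mask = torch.zeros_like(x)
    mask.scatter_(1, topk.indices, 1.0)
    return x * mask

activations = torch.relu(torch.randn(4, 100))  # a batch of dense activations
sparse = k_winners(activations, k=5)           # at most ~5% of units stay active
print((sparse != 0).float().mean().item())     # fraction of active units
```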


Written by Jeff Hawkins, Co-founder and Christy Maver, VP of Marketing

First posted in March 2018; updated in January 2019

In our most recent peer-reviewed paper, A Framework for Intelligence and Cortical Function Based on Grid Cells in the Neocortex, we put forward a novel theory for how the neocortex works. The Thousand Brains Theory of Intelligence proposes that rather than learning one model of an object (or concept), the brain builds many models of each object. Each model is built using different inputs, whether from slightly different parts of the sensor (such as different fingers on your hand) or…


Written by Donna Dubinsky, CEO and Christy Maver, VP of Marketing

Last month, Numenta released a major new theory for intelligence and cortical computation. As we do with all of our research, the team documented the theory in a research paper and made it available on a preprint server while kicking off the submission process with a peer-reviewed journal. However, we also did something we’ve never done before: we (Donna and Christy) created a “companion piece” to the research paper. What’s a companion piece and why did we write it? How did two non-neuroscientists write a paper about a neuroscience…

