AISaturdayLagos: The Brain And The Model

Tejumade Afonja
Published in AI Saturdays
5 min read · Feb 22, 2018

“I am a brain, Watson. The rest of me is a mere appendix. Therefore, it is the brain I must consider.”
Arthur Conan Doyle, The Adventure of the Mazarin Stone

AI Saturdays Lagos Week 7 was held on 17th Feb, 2018. We continued with Collaborative Filtering and Embeddings, did an introduction to Recurrent Neural Networks, and ran through the fast.ai code-base for the week.

After lunch, we moved on to Stanford CS231n’s Training Neural Networks (Part 2), where we talked about Fancier Optimization, Regularization and Transfer Learning. We then went through this week’s paper, Resurrecting the sigmoid in deep learning through dynamical isometry: theory and practice.

We concluded the week with project discussions among the different groups.

Kenechi Franklin Dukor wrote a fascinating post about The Brain and The Model.

THE BRAIN AND THE MODEL

Terminologies and Concepts that Relate Deep Learning to the Brain

One of the things that made Deep Learning frustrating for me was the numerous new and strange terminologies associated with it. Over time, with the help of some lectures and learning, some analogies came up in my head that helped me relate Deep Learning to the brain.

DEEP LEARNING MODEL

A model is basically an equation that takes in an input and produces a result (an output). Models try to predict real-life occurrences, and Deep Learning models are just like any other models.
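To make the “equation in, result out” idea concrete, here is a toy sketch (not from the original post; the weight and bias values are made up for illustration):

```python
# A minimal "model": a linear equation that maps an input to an output.
# The weight (2.0) and bias (1.0) are invented numbers for illustration.

def model(x, weight=2.0, bias=1.0):
    """Predict an output from a single numeric input."""
    return weight * x + bias

prediction = model(3.0)  # 2.0 * 3.0 + 1.0 = 7.0
```

A Deep Learning model is the same idea, just with many more weights and a more complicated equation.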

I guess we can relate the human brain to a model.

Different brains perform different functions. For example, Isaac Newton’s brain was good at producing intuitions based on observation, a traffic warden’s brain is good at controlling traffic, and so on.

General Deep Learning Model (source)

Permit me to refer to the deep learning model as DL-brain.😁

NEURAL NETWORKS

The idea of a network is just an interaction of different nodes; like Facebook Network, Electrical Network and Internet Network.

As you must have figured out, the term Biological Neural Network has to do with the unique structure and workings of the brain.

Since Deep Learning tries to create models that perform just like the brain, it is fair enough to borrow the word neural; so we have Neural Networks for deep learning too.

Network activity involves transferring information from node to node through neurons (biological or artificial), and the way these nodes are connected is called the network’s architecture.
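As a rough sketch of what “a node passing information to another node” looks like in code (the weights and inputs here are invented for illustration), each artificial neuron just sums its weighted inputs and applies an activation function:

```python
import math

def sigmoid(z):
    # A common activation function: squashes any number into (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def node(inputs, weights, bias):
    # An artificial "neuron": weighted sum of inputs, then an activation.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return sigmoid(z)

# A tiny architecture: two hidden nodes feeding one output node.
x = [0.5, -1.0]
h1 = node(x, [0.8, 0.2], 0.0)
h2 = node(x, [-0.4, 0.9], 0.1)
out = node([h1, h2], [1.0, -1.0], 0.0)
```

Changing how many nodes there are and which nodes feed which is exactly what changing the architecture means.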

Neural Structure of the Human Brain and DL-Brain
What happens at the node (source: giphy.com)

DATA-SET

This is basically a large chunk of data we want the DL-brain to understand (learn).

You will agree with me that a newborn baby’s brain is unlearned and will only perform its predefined functions like blinking, crying, smiling, etc., until the baby is exposed to the different kinds of information (data-sets) in the world.

Similarly, our DL-brain has to be exposed to different information (data-sets). This process of exposing the DL-brain (model) to different data-sets is called training; yeah, just like training a child in school or at home.

Learn! Learn! … Data! Data! lots of Data! (source: allenai.org)

But how does the training take place in our DL brain?

WEIGHTS

At the first ever sight of a dog (say a Doberman), the human brain gets some information about what a dog looks like, but it might fail to identify a Caucasian Shepherd as a dog when it sees one later.

If the brain gets access to other varieties of dogs, it figures out the different features that make a dog a dog. This information is usually stored somewhere in the brain (memory), and I guess this memory is called a weight in our DL-brain.

Remember our DL-brain formula (model)? The variable ‘W’ is actually the weight. It is that one value that holds the precious information that lets the DL-brain label images correctly, identify sounds and recognize speech when it comes across them.

In other words, weight holds specific details (information) of specific features in the data-sets that it came across.

when the DL-brain remembers an image (Image classification) (source: giphy.com)

For the maths guys — it’s just that parameter we tweak in a model during parametric studies.

So, when we say we are “training a model”, we basically mean finding the right values for the weights. Once good weights have been found, we can say that the DL-brain has learnt the data.
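The idea of “finding the right values for the weights” can be sketched with a toy training loop (the data, starting weight and learning rate below are invented for illustration; this is plain gradient descent, one common way to do it):

```python
# Toy training: nudge a single weight until predictions fit the data.
# The true relationship here is y = 2x, so training should push w toward 2.0.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (input, target) pairs
w = 0.0     # start "unlearned", like a newborn's brain
lr = 0.05   # learning rate: how big each nudge is

for _ in range(200):
    # Gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # nudge the weight against the gradient

# After training, w has moved close to the true value 2.0
```

Each pass over the data adjusts the weight a little, which is what happens (at much larger scale) when a deep learning model is trained.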

But how do these weights in the DL-brain get this information?

…continue reading

Thank you Kenechi Franklin Dukor for an interesting analogy of the brain and the model.

AISaturdayLagos wouldn’t have happened without my fellow ambassador Azeez Oluwafemi and our partners FB Dev Circle Lagos, Vesper.ng and Intel.

A big thanks to Nurture.AI for this amazing opportunity.

Also read how AI Saturdays is Bringing the World Together with AI

See you next week 😎.

View our pictures here and follow us on twitter :)
