
Why AI researchers need to keep a close eye on what’s going on in brain science

There is a fascinating ballet unfolding between artificial intelligence and the human brain with its emergent properties. Each helps us peer deeper into the other, which in turn throws new light back on the first.

Brain researcher Christof Koch argues that we'll need high-tech brain enhancements to keep up in a world of relentlessly improving artificial intelligence. Education alone won't be enough. (WSJ registration required.)

We need to enhance our cognitive capabilities by directly intervening in our nervous systems. […] People could set their brains to keep their focus on a task for hours on end, or control the length and depth of their sleep at will.
Another exciting prospect is melding two or more brains into a single conscious mind by direct neuron-to-neuron links. This entity could call upon the memories and skills of its member brains, but would act as one “group” consciousness, with a single, integrated purpose to coordinate highly complex activities across many bodies.

Purdue researchers have paired high-fidelity fMRI scans with deep learning networks to decode, from brain activity alone, the videos people are watching. This is approaching real-time mind-reading.
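For the curious, here is a minimal sketch of the general idea behind this kind of decoding, not the Purdue team's actual pipeline: fit a linear decoder from voxel responses to the visual features of each video frame, then identify the watched frame by nearest-neighbour matching in feature space. The data below is synthetic and every variable name is illustrative.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Hypothetical data: fMRI responses (samples x voxels) recorded while
# subjects watched video frames described by CNN features (samples x features).
n_train, n_test, n_voxels, n_feats = 400, 50, 2000, 128
W = rng.normal(size=(n_feats, n_voxels)) * 0.1   # unknown "encoding" from features to voxels
feat_train = rng.normal(size=(n_train, n_feats))
feat_test = rng.normal(size=(n_test, n_feats))
fmri_train = feat_train @ W + rng.normal(scale=0.5, size=(n_train, n_voxels))
fmri_test = feat_test @ W + rng.normal(scale=0.5, size=(n_test, n_voxels))

# Decode: learn a linear map from brain activity back to stimulus features.
decoder = Ridge(alpha=10.0).fit(fmri_train, feat_train)
pred = decoder.predict(fmri_test)

# Identify each watched frame by nearest neighbour in feature space.
dists = ((pred[:, None, :] - feat_test[None, :, :]) ** 2).sum(-1)
accuracy = (dists.argmin(axis=1) == np.arange(n_test)).mean()
print(f"frame identification accuracy: {accuracy:.2f}")
```

The real studies are far richer (hierarchical visual features, hemodynamic modelling, cross-subject validation), but the decode-then-match structure is the essence of "reading out" what someone is watching.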

And in an n=34 study, Carnegie Mellon researchers were able to identify suicidal thoughts by looking at fMRI scans.

That's one half of the AI/brain pairing: AI helping us better understand, and possibly influence, our brains. The other half is how ideas from the brain feed back into AI research itself.

Geoff Hinton, the “grandfather of deep learning”, has a new approach to image classification called capsule networks. I’ve not read the academic paper, but this helpful write-up by Debarko De suggests that capsule networks depart from the traditional deep learning approach. That tradition has been to build networks with as little prior information or structure as possible and simply let them learn from data. It has yielded significant results since the explosion of digital data that followed the Web, but it also produces brittle systems that are vulnerable to trivial adversarial attacks and fail when the context or perspective shifts even slightly.

Capsule networks are built from capsules, which seem to encapsulate what grizzled machine-vision folk would have called feature engineering; they are essentially a mechanism for bootstrapping the learning process by giving the network some priors. Gary Marcus’ comment in the Wired piece is worth reading.
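To make that a little more concrete, here is a tiny sketch (my own NumPy toy, not Hinton’s code) of the two ingredients the write-ups emphasise: the “squash” nonlinearity and routing-by-agreement between capsules. The shapes and the toy example are illustrative assumptions.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Squash nonlinearity: keeps a capsule's direction but maps its
    # length into [0, 1) so length can act like a probability.
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def routing_by_agreement(u_hat, n_iters=3):
    # u_hat: predictions from lower capsules for each higher capsule,
    # shape (n_lower, n_higher, dim_higher).
    n_lower, n_higher, _ = u_hat.shape
    b = np.zeros((n_lower, n_higher))                          # routing logits
    for _ in range(n_iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)   # coupling coefficients
        s = (c[..., None] * u_hat).sum(axis=0)                 # weighted vote per higher capsule
        v = squash(s)                                          # higher-capsule outputs
        b = b + (u_hat * v[None, ...]).sum(-1)                 # agreement strengthens the routing
    return v

# Toy example: 8 lower capsules voting for 3 higher capsules of dimension 4.
u_hat = np.random.randn(8, 3, 4)
print(routing_by_agreement(u_hat).shape)  # (3, 4)
```

The agreement step is where the built-in prior shows up: a lower-level capsule only strengthens its vote for a higher-level capsule whose output already agrees with its prediction, rather than letting everything be learned from raw data alone.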

Jürgen Schmidhuber, the forgotten uncle of deep learning, writes with a lofty vision, eyes gazing firmly at the stars. His paean to long short-term memory (LSTM, a type of neural network) and its place in human progress is worth pondering:

Humans won’t play a significant role in the spreading of intelligence across the cosmos. But that’s OK. Don’t think of humans as the crown of creation. Instead view human civilization as part of a much grander scheme, an important step (but not the last one) on the path of the universe towards higher complexity. Now it seems ready to take its next step, a step comparable to the invention of life itself over 3.5 billion years ago.
This is more than just another industrial revolution. This is something new that transcends humankind and even biology. It is a privilege to witness its beginnings, and contribute something to it.

If you liked this look at the link between AI & brains, you’ll appreciate my weekly newsletter Exponential View. Sign up below!