Yoshua Bengio and Gary Marcus on the Best Way Forward for AI

Transcript of the 23 December 2019 AI Debate, hosted at Mila

Moderated and transcribed by Vincent Boucher, Montreal AI

AI DEBATE : Yoshua Bengio | Gary Marcus — Organized by MONTREAL.AI and hosted at Mila, on Monday, December 23, 2019, from 6:30 PM to 8:30 PM (EST)

Transcript of the AI Debate

Opening Address | Vincent Boucher

Diagram of a 2-layer Neural Network
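The diagram itself is not reproduced in this transcript. As a stand-in, here is a minimal NumPy sketch of the kind of 2-layer feed-forward network the slide depicted; the layer sizes and the tanh activation are illustrative assumptions, not taken from the slide:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 3 inputs, 4 hidden units, 2 outputs.
W1 = rng.standard_normal((3, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.standard_normal((4, 2))   # hidden -> output weights
b2 = np.zeros(2)

def forward(x):
    """Two-layer feed-forward pass: linear map, nonlinearity, linear map."""
    h = np.tanh(x @ W1 + b1)       # hidden-layer activations
    return h @ W2 + b2             # output layer (no final nonlinearity)

y = forward(np.array([1.0, -0.5, 0.2]))
print(y.shape)  # (2,)
```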
Agenda : The Best Way Forward For AI

Opening statement | Gary Marcus

Opening statement | Gary Marcus — 22 min.
Last week at NeurIPS
Overview
Part I: how I see AI, deep learning, and current ML, and how I got here
A cognitive scientist’s journey, with implications for AI
1986: Rules versus connectionism (neural networks)
The debate
1992: Why do kids (sometimes) say breaked rather than broke?
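Marcus's account of these overregularization errors pairs a productive default rule ("add -ed") with a memory of irregular exceptions; "breaked" appears when the exception is not retrieved. A toy sketch of that account (the word list and function names are illustrative, not from the slide):

```python
# Memorized irregular past tenses; everything else gets the default "-ed" rule.
IRREGULARS = {"break": "broke", "go": "went", "sing": "sang"}

def past_tense(verb, exception_retrieved=True):
    """Default rule + exception memory; a retrieval failure yields 'breaked'."""
    if exception_retrieved and verb in IRREGULARS:
        return IRREGULARS[verb]
    return verb + "ed"   # the productive default rule

print(past_tense("walk"))                            # walked
print(past_tense("break"))                           # broke
print(past_tense("break", exception_retrieved=False))  # breaked (overregularization)
```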
1998: Extrapolation & Training Space
1999: Rule learning in 7-month-old infants
2001: The Algebraic Mind
2001: symbol-manipulation
The Algebraic Mind
Neural-Symbolic Cognitive Reasoning
2012: The Rise of Deep Learning
2018: Critique of deep learning
The central conclusions of my academic work on cognitive science, and its implications for AI
Part II: Yoshua
First things first: I admire Yoshua
My differences are mainly with Yoshua’s earlier (e.g., 2014–2015) views
Recently, however, Yoshua has taken a sharp turn towards many of the positions I have long advocated
Disagreements
1. Yoshua’s (mis)representation of my position (1 of 2)
1. Yoshua’s (mis)representation of my position (2 of 2)
2. What kind of hybrid should we seek?
To argue against symbol-manipulation, you have to show that your system doesn’t implement symbols
Attention here looks a lot like a means for manipulating symbols
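The slide's claim, that attention can function as symbol manipulation, can be illustrated with toy dot-product attention over a small "memory" of key/value slots: as one score dominates, the softmax approaches a one-hot selection, i.e., retrieval of a discrete item. All names, sizes, and the temperature parameter here are illustrative assumptions:

```python
import numpy as np

def attend(query, keys, values, temperature=1.0):
    """Dot-product attention: a soft, differentiable lookup over value slots."""
    scores = keys @ query / temperature
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                 # softmax over slots
    return weights, weights @ values         # weighted blend of values

# Three "symbol" slots stored as key/value pairs.
keys = np.eye(3)
values = np.array([[10.0], [20.0], [30.0]])

# A query aligned with key 1; a low temperature sharpens selection toward one-hot.
w, out = attend(np.array([0.0, 1.0, 0.0]), keys, values, temperature=0.1)
print(np.round(w, 3))   # nearly one-hot: selects slot 1
print(out)              # close to [20.]
```

With a high temperature the same mechanism blends slots instead, which is one way to frame the explicit-vs-implicit-symbols question debated below.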
“We tried symbols and they don’t work”
Mao et al., arXiv 2019
Lots of knowledge is not “conveniently representable” with rules
3. Innateness
4. Brains and neural networks
First, deep nets aren’t much like brains
What kind of neural network is the brain?
“Symbols aren’t biologically plausible”
Even if it somehow turned out that the brain never manipulated symbols, why exclude them from AI?
5. Compositionality
Recursion, embedding, compositionality
The semantics of large-scale vector-based systems like BERT aren’t nearly precise enough
“You can’t cram the meaning of an entire f***ing sentence into a single f***ing vector”
Compositionality isn’t just about language…
Part III: Synthesis
Conclusions
At the same time
AI has had many waves that come and go
Prediction: When Yoshua applies his formidable model-building talents to models that acknowledge and incorporate explicit operations over variables, magic will start to happen

Opening statement | Yoshua Bengio

Opening statement | Yoshua Bengio — 20 min.
Debate with Gary Marcus
MAIN POINTS
On the term deep learning
Agent Learning Needs OOD Generalization
Compositionality helps IID and OOD generalization
Systematic Generalization
From Attention to Indirection
Consciousness Prior
Consciousness Prior ➔ sparse factor graph
What causes changes in distribution?
RIMs: modularize computation and operate on sets of named and typed objects
PRIORS for learning high-level semantic representations
Contrast with the symbolic AI program
MY BET: Not a simple hybrid of GOFAI & Deep Nets
EXPLICIT or IMPLICIT SYMBOLS?
Let’s Debate!

Response | A Dialog Between Yoshua Bengio & Gary Marcus

Interview | Vincent Boucher : Yoshua Bengio & Gary Marcus (dialog)

Public questions from the audience at Mila | Yoshua Bengio & Gary Marcus

International audience questions | Yoshua Bengio & Gary Marcus

Written by

MONTREAL.AI | Montreal Artificial Intelligence
