Yann LeCun Delivers Keynote Address at Inaugural NYU Tech Summit
LeCun talked about his start at Bell Labs, his current role at Facebook, and the state of AI today
On Wednesday, November 14, NYU held its first annual Tech Summit. NYU CIO Len Peters announced that the event had nearly 1,200 registrants and 26 partner vendors. He introduced NYU President Andrew Hamilton, who said he was pleased the timing of the event coincided with Amazon’s HQ2 announcement because NYU contributed to the city’s bid. Hamilton welcomed Yann LeCun, Vice President and Chief Artificial Intelligence Scientist at Facebook; Silver Professor of Computer Science, Neural Science, and Electrical and Computer Engineering at NYU; and Founding Director of NYU’s Center for Data Science.
LeCun, who splits his career between academia and industry, discussed his current role at Facebook, where he manages about 210 scientists, engineers, postdocs, and PhD students within Facebook’s AI Research division. He emphasized the importance of scientist-driven research, open collaboration, and biological inspiration. LeCun said the neurological development of infants and children particularly interests him because of the speed at which they acquire knowledge and skills. However, he cautioned that data scientists and engineers should aim to understand the underlying principles of nature rather than copying it outright, as Clément Ader did in nineteenth-century France when he built planes modeled on bats.
To explain the current state of deep learning, LeCun gave a brief history of AI development. Ninety-five percent of what we call AI today is supervised learning, he said, which in its most fundamental form involves a machine with adjustable knobs (its parameters) that are tuned until inputs map to the desired outputs. The 1957 Perceptron was the first such machine, followed by the 1960 Adaline; from the beginning, hardware development has been deeply intertwined with AI development. Interest in machine learning waned in the late 1960s as mainstream AI moved toward methods based on logic and reasoning. In the mid-1980s, LeCun joined Bell Labs to work on neural net hardware, but general interest in neural networks subsided in the mid-1990s because computers were too slow and expensive, data collection applications were narrow, and the complexity of the software required levels of investment and collaboration that were not available at the time.
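The "machine with knobs" idea can be sketched in a few lines of Python: a perceptron whose weights (the knobs) are nudged toward correct outputs from labeled examples. The OR dataset, learning rate, and epoch count below are illustrative assumptions, not details from the talk.

```python
def train_perceptron(samples, labels, epochs=10, lr=0.1):
    """Learn weights and a bias for a linear threshold unit."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # Predict 1 if the weighted sum clears the threshold, else 0.
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            # Turn each knob proportionally to the error on this example.
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Logical OR is linearly separable, so the perceptron can learn it.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 1, 1, 1]
w, b = train_perceptron(X, y)
print([predict(w, b, x) for x in X])  # → [0, 1, 1, 1]
```

Early machines like the Perceptron and Adaline implemented essentially this update rule in hardware, with physically adjustable weights.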
In 2003, LeCun began his first project at NYU: a rudimentary autonomous driving system built on techniques that were not yet called deep learning. LeCun explained that in deep learning, the process of designing feature extractors (analogous to the knobs on early physical machines) is automated rather than hand-engineered. LeCun worked on LAGR (Learning Applied to Ground Robots) from 2005 to 2009, which used labels from a classical vision system to train itself to drive and maneuver around obstacles.
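That contrast can be sketched on a toy problem: XOR is not linearly separable in the raw inputs, so a single linear unit cannot solve it, but a small network whose hidden layer acts as a *learned* feature extractor can, with both layers trained together by backpropagation. The task, network size, and hyperparameters below are illustrative assumptions, not details of LeCun's systems.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One hidden layer of 4 units plays the role of a learned feature
# extractor; the output unit is a classifier on those learned features.
H = 4
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
W2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
Y = [0, 1, 1, 0]  # XOR: no hand-engineered linear feature map needed

def forward(x):
    h = [sigmoid(sum(W1[j][i] * x[i] for i in range(2)) + b1[j])
         for j in range(H)]
    o = sigmoid(sum(W2[j] * h[j] for j in range(H)) + b2)
    return h, o

def loss():
    return sum((forward(x)[1] - y) ** 2 for x, y in zip(X, Y))

initial = loss()
lr = 0.5
for _ in range(5000):
    for x, y in zip(X, Y):
        h, o = forward(x)
        # Backpropagate the squared error through both layers, so the
        # feature extractor and the classifier are tuned jointly.
        d_o = 2 * (o - y) * o * (1 - o)
        for j in range(H):
            d_h = d_o * W2[j] * h[j] * (1 - h[j])  # uses pre-update W2[j]
            W2[j] -= lr * d_o * h[j]
            for i in range(2):
                W1[j][i] -= lr * d_h * x[i]
            b1[j] -= lr * d_h
        b2 -= lr * d_o

print(initial, loss())  # training drives the error down
```

The point of the sketch is that nobody specified what the hidden units should detect; their weights start random and become useful features only through gradient descent.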