Jeff Dean’s 1990 Senior Thesis Is Better Than Yours

Synced · Published in SyncedReview · Sep 11, 2018

Google AI lead Jeff Dean recently posted a link to his 1990 senior thesis on Twitter, which set off a wave of nostalgia for the early days of machine learning in the AI community.

Parallel Implementation of Neural Network Training: Two Back-Propagation Approaches may be almost 30 years old and only eight pages long, but the paper does a remarkable job of explaining the neural network training methods that underpin modern artificial intelligence.

UC San Diego Assistant Professor Philip Guo followed up on Twitter with a question: “Folks who were writing machine learning code over 10 years ago… what languages/libraries/frameworks did you use back then?” Dean responded that he wrote the parallel neural network training code in C — a programming language that dates back to the early 1970s.

In the thesis Dean explores two methods for training neural networks in parallel with backpropagation. The first is a pattern-partitioned approach, which gives every processor a complete copy of the network and divides the training patterns among the available processors; the second is a pipelined approach, which partitions the network’s neurons across the available processors, so that each processor handles the activations passing through its portion of the network as training patterns are pipelined through.
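The pattern-partitioned approach maps naturally onto what is now called data-parallel training. Below is a minimal Python sketch of that idea, not a reproduction of the thesis code (which was written in C); the toy one-hidden-layer network, shapes, and learning rate are illustrative assumptions, not details from the paper. Each simulated “processor” holds a full copy of the network, runs backpropagation on its own shard of the training patterns, and the per-shard gradients are combined before a single synchronous weight update. The pipelined approach, which instead splits the neurons themselves across processors, is not sketched here.

```python
# Hypothetical sketch of pattern-partitioned backpropagation:
# every "processor" gets a full copy of a toy one-hidden-layer network
# and a shard of the training patterns; gradients are summed, then
# one synchronous weight update is applied.
import numpy as np

rng = np.random.default_rng(0)

def init_net(n_in, n_hidden, n_out):
    # Full copy of the (toy) network that every processor would hold.
    return {
        "W1": rng.standard_normal((n_in, n_hidden)) * 0.1,
        "W2": rng.standard_normal((n_hidden, n_out)) * 0.1,
    }

def grads(net, X, y):
    # Forward pass, then backpropagation on one shard of patterns.
    h = np.tanh(X @ net["W1"])              # hidden activations
    out = h @ net["W2"]                     # linear outputs
    err = out - y                           # dLoss/dout for squared error
    dW2 = h.T @ err
    dh = (err @ net["W2"].T) * (1.0 - h ** 2)
    dW1 = X.T @ dh
    return {"W1": dW1, "W2": dW2}

def pattern_partitioned_step(net, X, y, n_procs, lr=0.01):
    # Divide the training patterns among the available "processors".
    shards = zip(np.array_split(X, n_procs), np.array_split(y, n_procs))
    # Each processor computes gradients for its shard on its own full
    # copy of the network; this loop stands in for parallel execution.
    shard_grads = [grads(net, Xs, ys) for Xs, ys in shards]
    # Combine the partial gradients and apply one synchronous update.
    for k in net:
        total = sum(g[k] for g in shard_grads)
        net[k] -= lr * total / len(X)
    return net

# Toy usage: 4 simulated processors, 64 random training patterns.
X = rng.standard_normal((64, 8))
y = rng.standard_normal((64, 2))
net = init_net(8, 16, 2)
for _ in range(10):
    net = pattern_partitioned_step(net, X, y, n_procs=4)
```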

Dean also built neural networks of different sizes and tested the two methods with a variety of input data.

Dean’s experiments showed that the speedup from pattern partitioning improves as the network and the amount of input data grow.

The year 1990 marked the end of the “AI Winter” and the dawn of the era of big data that defined AI in the 2000s. AI research in academia was relatively dormant during the AI Winter, as the technology had failed to deliver on its promises, and funding shifted from AI to other research areas.

At the time Dean’s thesis was published, artificial neural networks were a relatively new but growing research area. Years later, Geoffrey Hinton — widely regarded as the godfather of deep learning and the author of a number of important papers on backpropagation — would become Dean’s colleague at Google.

After all these years, Dean’s senior thesis is still regarded as a brilliant document. “As always, Jeff Dean doesn’t fail to inspire respect,” posted Hacker News user Halfings. “My master thesis was ~60 pages long, and was probably about 1/1000 as useful as this one.”

Journalist: Fangyu Cai | Editor: Michael Sarazen

Follow us on Twitter @Synced_Global for more AI updates!

Subscribe to Synced Global AI Weekly to get insightful tech news, reviews and analysis!
