ACM Announces Best Doctoral Dissertation

Synced | SyncedReview | May 23, 2019

The Association for Computing Machinery (ACM) has announced the recipient of its 2018 ACM Doctoral Dissertation Award. Chelsea Finn, who completed her PhD at UC Berkeley, took the top honour with her dissertation Learning to Learn with Gradients.

Meta-learning is a popular machine learning research topic in which algorithms learn how to learn, using data from previous tasks to adapt quickly to new ones. Early research in meta-learning focused largely on designing complex neural network architectures.

Finn is now a research scientist at Google Brain and a postdoctoral researcher at the Berkeley AI Research Lab (BAIR). In her dissertation she introduced a new gradient-based meta-learning algorithm, Model-Agnostic Meta-Learning (MAML), which enables deep networks to adapt to new tasks from small datasets. The research draws on the way human learning leverages prior experience, concepts and abstractions, and is expected to spare computer scientists much of the heavy manual work of designing complex architectures.

Unlike previous meta-learning approaches, MAML focuses on learning transferable representations rather than a new learning rule. It keeps standard gradient descent, the update rule that adjusts a model's parameters using the gradient of a loss function, as its fixed learning rule, so it inherits the desirable properties of that optimizer while retaining full expressivity: what is meta-learned is an initialization from which a few gradient steps are enough to adapt the network to a new task.
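To make the mechanics concrete, below is a minimal sketch of a MAML training loop on a toy sine-wave regression problem of the kind used in Finn's work. It assumes PyTorch; the network size, hyperparameters and helper names such as sample_sine_task are illustrative choices, not the dissertation's exact setup.

```python
import torch

# A tiny fully connected regressor with explicit parameter tensors, so the
# inner-loop update can be written functionally (needed for MAML's
# second-order gradients).
def init_params():
    w1 = (torch.randn(1, 40) * 0.5).requires_grad_()
    b1 = torch.zeros(40, requires_grad=True)
    w2 = (torch.randn(40, 1) * 0.5).requires_grad_()
    b2 = torch.zeros(1, requires_grad=True)
    return [w1, b1, w2, b2]

def forward(params, x):
    w1, b1, w2, b2 = params
    return torch.relu(x @ w1 + b1) @ w2 + b2

def loss_fn(params, x, y):
    return ((forward(params, x) - y) ** 2).mean()

def sample_sine_task():
    # Each "task" is a sine wave with a random amplitude and phase.
    amp = torch.rand(1) * 4.9 + 0.1
    phase = torch.rand(1) * 3.14159
    def sample(k=10):
        x = torch.rand(k, 1) * 10 - 5
        return x, amp * torch.sin(x + phase)
    return sample

inner_lr, outer_lr = 1e-2, 1e-3
params = init_params()                      # the shared initialization
meta_opt = torch.optim.Adam(params, lr=outer_lr)

for step in range(10000):
    meta_opt.zero_grad()
    for _ in range(4):                      # meta-batch of tasks
        task = sample_sine_task()
        x_tr, y_tr = task()                 # support set, for adaptation
        x_te, y_te = task()                 # query set, for the meta-update
        # Inner loop: one gradient step away from the shared initialization.
        grads = torch.autograd.grad(loss_fn(params, x_tr, y_tr),
                                    params, create_graph=True)
        adapted = [p - inner_lr * g for p, g in zip(params, grads)]
        # Outer loop: the post-adaptation loss is differentiated through
        # the inner step, updating the initialization itself.
        loss_fn(adapted, x_te, y_te).backward()
    meta_opt.step()
```

The defining design choice is that the meta-gradient flows through the inner gradient step (create_graph=True), so the shared initialization is optimized for how well it adapts after a few steps, not for how well it performs directly.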

The ACM says the MAML approach “has had a huge impact in this area and has been widely used in other areas of reinforcement learning, computer vision and machine learning.”

Also announced were the Honorable Mentions for the 2018 ACM Doctoral Dissertation Award: Princeton University Computer Science graduates Ryan Beckett and Tengyu Ma.

Ryan Beckett is a researcher in Microsoft Research's Mobility and Networking group. In his PhD thesis, Network Control Plane Synthesis and Verification, Beckett proposed general and efficient algorithms for creating and verifying network control plane configurations.

Tengyu Ma is an assistant professor in the Computer Science and Statistics departments at Stanford University. In his doctoral thesis, Non-convex Optimization for Machine Learning: Design, Analysis, and Understanding, he developed new theory for the non-convex optimization that underpins modern machine learning, proving convergence guarantees for non-convex optimization algorithms and characterizing the properties of the models they train.
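As a toy illustration of why such guarantees matter (my own example, not drawn from the thesis), the sketch below runs plain gradient descent on a simple non-convex function, where the minimum reached depends entirely on the initialization.

```python
import torch

# f(x) = (x^2 - 1)^2 is non-convex, with global minima at x = -1 and x = 1
# separated by a local maximum at x = 0. Gradient descent converges to a
# different minimum depending on where it starts.
def f(x):
    return (x ** 2 - 1) ** 2

for x0 in (-2.0, 0.5):
    x = torch.tensor(x0, requires_grad=True)
    for _ in range(200):
        loss = f(x)
        loss.backward()
        with torch.no_grad():
            x -= 0.05 * x.grad             # plain gradient descent step
        x.grad.zero_()
    print(f"start {x0:+.1f} -> converged to x = {x.item():+.3f}")

# Expected output (approximate):
#   start -2.0 -> converged to x = -1.000
#   start +0.5 -> converged to x = +1.000
```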

Author: Yuqing Li | Editor: Michael Sarazen

2018 Fortune Global 500 Public Company AI Adaptivity Report is out!
Purchase a Kindle-formatted report on Amazon.
Apply for Insight Partner Program to get a complimentary full PDF report.

Follow us on Twitter @Synced_Global for daily AI news!

We know you don’t want to miss any stories. Subscribe to our popular Synced Global AI Weekly to get weekly AI updates.
