Facebook's Randomly Wired Neural Networks Outperform Human-Designed Networks for Image Recognition

Synced · Published in SyncedReview · 6 min read · Apr 19, 2019

Neural networks for image recognition have matured from simple chain-like models to structures with multiple wiring paths. The emergence of Neural Architecture Search (NAS) has made it possible to optimize models with more elaborate wiring and operation types. Although this has made NAS a promising research direction, the NAS network generator itself must still be hand-designed, which constrains the search space. In a new Facebook AI Research (FAIR) study, researchers explore a more diverse set of connectivity patterns through randomly wired neural networks. The experiments deliver a striking result: the randomly wired networks outperform human-designed networks such as ResNet and ShuffleNet on image recognition tasks.

“In this paper, we explore a more diverse set of connectivity patterns through the lens of randomly wired neural networks. To do this, we first define the concept of a stochastic network generator that encapsulates the entire network generation process. Encapsulation provides a unified view of NAS and randomly wired networks. Then, we use three classical random graph models to generate randomly wired graphs for networks. The results are surprising: several variants of these random generators yield network instances that have competitive accuracy on the ImageNet benchmark. These results suggest that new efforts focusing on designing better network generators may lead to new breakthroughs by exploring less constrained search spaces with more room for novel design.” (arXiv).
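The three classical random graph models the paper employs are Erdős–Rényi (ER), Barabási–Albert (BA), and Watts–Strogatz (WS). Below is a minimal sketch of sampling from these families with networkx; the parameter values are illustrative defaults, not the paper's exact settings.

```python
import networkx as nx

def random_graph(model, n=32, seed=0):
    """Sample an undirected graph from one of the three classical
    random graph families. Parameter values are illustrative,
    not the paper's exact settings."""
    if model == "ER":  # Erdos-Renyi: each edge exists independently with prob p
        return nx.erdos_renyi_graph(n, p=0.2, seed=seed)
    if model == "BA":  # Barabasi-Albert: preferential attachment, m edges per new node
        return nx.barabasi_albert_graph(n, m=5, seed=seed)
    if model == "WS":  # Watts-Strogatz: ring lattice with random rewiring
        return nx.watts_strogatz_graph(n, k=4, p=0.75, seed=seed)
    raise ValueError(f"unknown model: {model}")

# Fixing the seed makes the "stochastic" generator deterministic:
# the same (model, n, seed) triple always yields the same wiring.
g = random_graph("WS", seed=42)
print(g.number_of_nodes(), g.number_of_edges())
```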

Synced invited Tongliang Liu, an Assistant Professor at the University of Sydney whose research focuses on machine learning and computer vision, to share his thoughts on FAIR's paper Exploring Randomly Wired Neural Networks for Image Recognition.

How would you describe randomly wired neural networks?

Designing efficient and effective neural network architectures is crucial for the development of deep learning. Kaiming He and his team from Facebook AI Research (FAIR) have been at the forefront of this area. After developing Deep Residual Networks (ResNets), they recently proposed Randomly Wired Neural Networks, a novel way to wire neural networks.

Randomly Wired Neural Networks are not entirely randomly generated. They still include some human-designed structure, i.e., the networks are manually divided into stages with progressively downsampled representations. They differ from traditional networks, however, in that the node connections within each stage are randomly generated rather than human-designed. Specifically, stochastic network generators (SNGs) with random seeds randomly generate graphs that are then mapped into computable stages. It is very interesting that Randomly Wired Neural Networks with certain SNGs outperform or are comparable to state-of-the-art networks. I should note that these SNGs are built from classical families of random graph models in graph theory to reduce bias from the network designers, showing that exploring SNGs with prior knowledge could be a promising research direction for network design.
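To make the graph-to-stage mapping concrete, here is a minimal PyTorch sketch (the class names NodeOp and RandWireStage are my own, and the details are simplified relative to the paper): each undirected edge is oriented from the lower- to the higher-indexed node, which yields a DAG, and each node aggregates its inputs with a learnable sigmoid-weighted sum before applying a ReLU, 3x3 separable convolution, and batch normalization, roughly following the node operation the paper describes.

```python
import torch
import torch.nn as nn

class NodeOp(nn.Module):
    """One graph node: learnable weighted-sum aggregation of its
    inputs, followed by ReLU -> 3x3 separable conv -> BatchNorm."""
    def __init__(self, in_degree, channels):
        super().__init__()
        # One learnable weight per incoming edge, squashed by a
        # sigmoid so each contribution lies in (0, 1).
        self.edge_w = nn.Parameter(torch.zeros(max(in_degree, 1)))
        self.op = nn.Sequential(
            nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1,
                      groups=channels, bias=False),    # depthwise
            nn.Conv2d(channels, channels, 1, bias=False),  # pointwise
            nn.BatchNorm2d(channels),
        )

    def forward(self, inputs):  # inputs: list of equally shaped tensors
        w = torch.sigmoid(self.edge_w)
        x = sum(wi * xi for wi, xi in zip(w, inputs))
        return self.op(x)

class RandWireStage(nn.Module):
    """Turn an undirected networkx graph into a computable stage:
    orient every edge from the lower- to the higher-indexed node
    (yielding a DAG), then evaluate nodes in index order."""
    def __init__(self, graph, channels):
        super().__init__()
        self.n = graph.number_of_nodes()
        self.preds = {v: sorted(u for u in graph.neighbors(v) if u < v)
                      for v in range(self.n)}
        self.nodes = nn.ModuleList(
            NodeOp(len(self.preds[v]), channels) for v in range(self.n))

    def forward(self, x):
        out = {}
        for v in range(self.n):
            # Nodes with no predecessors act as input nodes and see x.
            ins = [out[u] for u in self.preds[v]] or [x]
            out[v] = self.nodes[v](ins)
        # Nodes with no successors act as output nodes; average them.
        sinks = [v for v in range(self.n)
                 if not any(v in self.preds[u] for u in range(self.n))]
        return sum(out[v] for v in sinks) / len(sinks)

# Example (reusing random_graph from the earlier sketch):
# stage = RandWireStage(random_graph("WS", n=8, seed=0), channels=16)
# stage(torch.randn(2, 16, 32, 32)).shape  # torch.Size([2, 16, 32, 32])
```

The paper additionally inserts dedicated input and output nodes into each stage and halves the spatial resolution between stages; this sketch omits those details for brevity.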

Why does this research matter?

Deep neural networks with different wiring patterns and operations are powerful tools for processing data and learning new representations. The way a network is wired strongly influences both its performance and its properties. Currently, the well-known wiring patterns include full wiring and various human-designed wirings. Because of problems such as a very large hypothesis class and non-convexity, fully connected networks, e.g., the Multilayer Perceptron (MLP), fail to provide leading performance in some applications; meanwhile, networks with human-designed wiring, such as Convolutional Neural Networks (CNNs) and their variants (e.g., ResNet and DenseNet), which place constraints on their wiring patterns, obtain dominant results in specific tasks such as image classification. A natural question arises: do better wiring patterns exist, and what would an optimal pattern look like? Recently, Zoph et al. proposed Neural Architecture Search (NAS) to learn wiring patterns directly on the dataset of interest using reinforcement learning, while Randomly Wired Neural Networks take another big step forward by successfully showing that even randomly wired networks can deliver exceptional performance, and that there may be considerable room for improvement in network wiring.

What impact might this research bring to the research community?

The research opens a new door for designing neural network architectures. I would like to quote the authors' claim: the research suggests a new transition from designing an individual network to designing a network generator may be possible (given that several SNG variants output networks with competitive accuracy on ImageNet and that the variance in accuracy across different SNGs is low), analogous to how our community has transitioned from designing features to designing a network that learns features. Randomly Wired Neural Networks show notable performance on image classification, and they may also perform well on classification-related vision tasks such as image segmentation, object detection, and depth estimation.

Can you identify any bottlenecks in the research?

Although Randomly Wired Neural Networks deliver remarkable performance, the mathematical foundations of SNGs remain elusive. Why can SNGs lead to state-of-the-art performance? Why do different SNGs perform differently? What role do SNGs play in feature learning? These questions lack mathematical interpretation. Currently, we do not know the underlying principles that guarantee their success, so it may be hard to deliberately adapt an SNG to different purposes or pertinently strengthen its generalisation ability.

Can you predict any potential future developments related to this research?

The research will certainly evoke interest from the community in exploring innovative SNGs. Beyond employing families of random graph models from graph theory, we can also expect random graph models designed specifically for particular tasks. Incorporating different kinds of prior knowledge into SNGs and studying their wiring spaces should be a future direction for constructing effective generators.

The paper Exploring Randomly Wired Neural Networks for Image Recognition is on arXiv.

About Dr. Tongliang Liu

Tongliang Liu is a Lecturer (Assistant Professor) in Machine Learning with the School of Computer Science at the University of Sydney, and a core member of the UBTECH Sydney AI Centre. His research focuses on providing mathematical understanding of machine learning models and designing novel, theoretically grounded learning algorithms for problems in computer vision and data mining, with a particular emphasis on learning with noisy or weak supervision. He is a recipient of the Discovery Early Career Researcher Award (DECRA) from the Australian Research Council (ARC). Read his recent research on label noise here: Learning with bounded instance- and label-dependent label noise.

Synced Insight Partner Program

The Synced Insight Partner Program is an invitation-only program that brings together influential organizations, companies, academic experts, and industry leaders to share professional experiences and insights through interviews, public speaking engagements, and more. Synced invites all industry experts, professionals, analysts, and others working in AI technologies and machine learning to participate.

Simply apply for the Synced Insight Partner Program and let us know about yourself and your focus in AI. We will respond once your application has been reviewed.

2018 Fortune Global 500 Public Company AI Adaptivity Report is out!
Purchase a Kindle-formatted report on Amazon.
Apply for Insight Partner Program to get a complimentary full PDF report.

Follow us on Twitter @Synced_Global for daily AI news!

We know you don’t want to miss any stories. Subscribe to our popular Synced Global AI Weekly to get weekly AI updates.
