
When training deep neural networks, the goal is to automatically discover good “internal representations.” The most widely used method for this is backpropagation, which applies gradient descent to adjust the neural network’s weights. Now, researchers from the Victoria University of Wellington School of Engineering and Computer Science have introduced the HSIC (Hilbert-Schmidt independence criterion) bottleneck as an alternative to backpropagation for finding good representations.
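At its core, the method scores each hidden layer with the Hilbert-Schmidt independence criterion: a layer’s activations should retain as little information about the raw inputs as possible while staying strongly dependent on the labels. Below is a minimal sketch (not the authors’ code) of a biased HSIC estimator with Gaussian kernels and a per-layer bottleneck objective; the function names and the hyperparameter values for σ and β are illustrative assumptions.

```python
import torch

def gaussian_kernel(x, sigma):
    # x: (m, d) batch of activations -> (m, m) Gram matrix with a Gaussian kernel
    sq_dists = torch.cdist(x, x) ** 2
    return torch.exp(-sq_dists / (2 * sigma ** 2))

def hsic(kx, ky):
    # Biased HSIC estimator: tr(Kx H Ky H) / (m - 1)^2, with centering H = I - 11^T / m
    m = kx.shape[0]
    h = torch.eye(m, device=kx.device) - torch.ones(m, m, device=kx.device) / m
    return torch.trace(kx @ h @ ky @ h) / (m - 1) ** 2

def hsic_bottleneck_loss(x, z, y, sigma=5.0, beta=100.0):
    # Per-layer objective: make the hidden activations z nearly independent of the
    # input x while keeping them dependent on the (one-hot) labels y.
    kx, kz, ky = (gaussian_kernel(v, sigma) for v in (x, z, y))
    return hsic(kz, kx) - beta * hsic(kz, ky)
```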

The new method has several distinct advantages. Instead of propagating errors through the chain rule as traditional backpropagation does, the HSIC bottleneck optimizes each layer’s objective independently, eliminating the vanishing and exploding gradient problems found in backpropagation. It also allows layers to be trained in parallel, and as a result requires significantly fewer operations. Finally, because the method needs no backward sweep, it removes the requirement for symmetric feedback.
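Because each layer has its own objective, training can be written as a purely local update with no backward sweep through the rest of the network. The hedged sketch below reuses hsic_bottleneck_loss from above; the layer widths, optimizer, and learning rate are assumptions for illustration only.

```python
import torch
import torch.nn as nn

# Five fully connected layers; hsic_bottleneck_loss is the sketch defined above.
widths = [784, 256, 256, 256, 256, 256]
layers = [nn.Sequential(nn.Linear(i, o), nn.ReLU())
          for i, o in zip(widths, widths[1:])]
optims = [torch.optim.SGD(layer.parameters(), lr=0.1) for layer in layers]

def train_step(x, y_onehot):
    # x: (m, 784) flattened images, y_onehot: (m, 10) float one-hot labels.
    h = x
    for layer, opt in zip(layers, optims):
        z = layer(h)                                 # forward through this layer only
        loss = hsic_bottleneck_loss(x, z, y_onehot)  # purely local objective
        opt.zero_grad()
        loss.backward()                              # gradient never leaves this layer
        opt.step()
        h = z.detach()                               # block gradients to earlier layers
```

Detaching each layer’s output is what removes the end-to-end chain rule: every weight update depends only on that layer’s local HSIC objective, which is why the layers can in principle be trained in parallel and why no symmetric feedback path is needed.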

The researchers present two approaches for producing usable classifications from an HSIC-bottleneck trained network. The first is a standard feedforward network, which generates one-hot-style outputs that can be directly permuted to perform classification. The second is the σ-combined network, in which a single layer is appended as an aggregator that assembles all the hidden representations; each representation is trained with a specific kernel scale σ, so that information at all scales σ is available to the post-training stage.
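A rough sketch of the σ-combined idea follows, under the assumption that several HSIC-bottleneck branches have already been trained, each with its own kernel scale σ (for example via the layer-wise step above); a single appended aggregator layer is then post-trained on their concatenated outputs. The branch sizes, σ values, and the use of cross-entropy for post-training are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

sigmas = [1.0, 5.0, 10.0]                           # one branch per kernel scale
branches = [nn.Sequential(nn.Linear(784, 256), nn.ReLU()) for _ in sigmas]
aggregator = nn.Linear(256 * len(sigmas), 10)       # the single appended layer
agg_opt = torch.optim.SGD(aggregator.parameters(), lr=0.1)

def post_train_step(x, y):
    # Branch outputs are detached: only the appended aggregator is updated here,
    # after each branch has been trained with its own sigma via the HSIC objective.
    feats = torch.cat([b(x).detach() for b in branches], dim=1)
    loss = F.cross_entropy(aggregator(feats), y)
    agg_opt.zero_grad()
    loss.backward()
    agg_opt.step()
```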

The research team also conducted experiments on the MNIST, FashionMNIST, and CIFAR-10 datasets for classic classification problems. Selected results are summarized below.

In experiments comparing a post-trained HSIC-bottleneck ResNet against a ResNet trained with standard backpropagation, the HSIC bottleneck provides a significant boost in performance, opening the possibility of learning classification tasks at near-competitive accuracy but without the limitations of backpropagation.

While backpropagation still plays a core role in most AI research, leading AI scientists are exploring alternatives. Last year backpropagation pioneer Dr. Geoffrey Hinton suggested the research community should ditch the technique and focus on unsupervised learning instead: “I don’t think [backpropagation] is how the brain works. We clearly don’t need all the labeled data.”

The paper The HSIC Bottleneck: Deep Learning Without Back-Propagation is on arXiv.


Author: Hecate He | Editor: Michael Sarazen


