DeepMind Proposes a Novel Way to Improve GANs Using Gradient Information

By Synced | Published in SyncedReview | May 28, 2019

Compressed Sensing’s (CS) ability to exploit the structure of natural images and recover an image from a limited number of random measurements has made it an elegant framework for recovering sparse signals from compressed measurements. Although CS is flexible and data efficient, its strong sparsity assumption and costly reconstruction process have restricted its application. Recent advancements combining CS with neural network generators have removed the sparsity limitation, but have not solved the reconstruction problem.
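To make the classical setup concrete, here is a minimal sketch, not from the paper, of recovering a sparse signal from random linear measurements; the signal size, measurement matrix, sparsity weight lam, and step count are all illustrative assumptions:

```python
import torch

# Minimal sketch of classical CS reconstruction (illustrative values throughout).
torch.manual_seed(0)
n, k = 256, 64                     # signal dimension, number of measurements
x = torch.zeros(n)
x[torch.randperm(n)[:8]] = 1.0     # ground-truth signal: sparse, 8 nonzeros
F = torch.randn(k, n) / k ** 0.5   # random Gaussian measurement matrix
m = F @ x                          # compressed measurements (k << n)

x_hat = torch.zeros(n, requires_grad=True)
opt = torch.optim.Adam([x_hat], lr=0.05)
lam = 0.01                         # sparsity (L1) weight, assumed
for _ in range(2000):              # slow, per-signal iterative reconstruction
    opt.zero_grad()
    loss = ((m - F @ x_hat) ** 2).sum() + lam * x_hat.abs().sum()
    loss.backward()
    opt.step()
```

The long per-signal optimization loop is precisely the costly reconstruction process referred to above.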

A group of researchers from DeepMind has introduced a novel framework that significantly improves signal recovery performance and speed. The approach jointly trains a generator and the optimization process for reconstruction via meta-learning. The researchers trained the measurement functions with various objectives and obtained a family of models aimed at minimizing measurement errors, with generative adversarial networks (GANs) emerging as a special case among these models. Building on this CS perspective, they proposed a novel way to improve GANs using gradient information from the discriminator.
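The core reconstruction idea can be sketched as follows (a hedged illustration, not DeepMind’s code: the generator G, measurement function F, the latent_dim attribute, step count, and learning rate are assumptions): instead of optimizing pixels directly, the method searches the generator’s latent space for a point whose measurements match the target’s.

```python
import torch

def reconstruct(x, G, F, steps=3, lr=0.01):
    """Sketch of reconstruction by latent optimization. G (generator),
    F (measurement function), G.latent_dim, steps and lr are all
    illustrative assumptions, not the paper's exact interface."""
    z = torch.zeros(1, G.latent_dim, requires_grad=True)
    opt = torch.optim.SGD([z], lr=lr)
    target = F(x).detach()          # measurements of the true signal
    for _ in range(steps):          # DCS reports needing only a few steps
        opt.zero_grad()
        loss = ((F(G(z)) - target) ** 2).sum()   # measurement error
        loss.backward()
        opt.step()
    return G(z).detach()            # decoded signal
```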

The researchers designed their approach to learn measurement functions by imposing a Restricted Isometry Property (RIP), the cornerstone of CS theory, as a training objective. From the resulting Deep Compressed Sensing (DCS) framework, and by imposing properties other than the RIP on the measurements, they derived two novel models, including a GAN model. Experiments showed the GAN model with discriminator-guided latent optimization produced more stable training dynamics and better results.
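Informally, the RIP requires the measurement function to approximately preserve distances between signals, so that comparisons in measurement space remain meaningful. A minimal sketch of such a training objective, assuming a learnable measurement network F and paired signal batches x1, x2 (the paper’s exact sampling and weighting may differ):

```python
import torch

def rip_loss(F, x1, x2):
    """Penalize the measurement network F for distorting pairwise
    distances, i.e. push ||F(x1) - F(x2)|| toward ||x1 - x2||.
    Simplified sketch; the paper's sampling/normalization may differ."""
    d_meas = (F(x1) - F(x2)).flatten(1).norm(dim=1)  # distance after measuring
    d_sig = (x1 - x2).flatten(1).norm(dim=1)         # distance between signals
    return ((d_meas - d_sig) ** 2).mean()
```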

Rather than relying on auto-encoding models that feature end-to-end trained encoder and decoder pairs, CS separates encoding and decoding into individual measurement and reconstruction processes and reconstructs signals from low-dimensional measurements through online optimization. This makes CS highly flexible and sample efficient in scenarios with noisy measurements and very little training data, such as MRI. However, its assumption of sparse signals and its slow optimization-based reconstruction still restrain the broader application of CS to tasks such as processing large-scale data, where deep learning approaches have shown greater potential.

The research team evaluated their DCS model on the MNIST and CelebA datasets. Experiments showed that DCS performs significantly better than the previous baseline model developed by Ashish Bora and other researchers at the University of Texas. Moreover, while the baseline model requires hundreds or thousands of gradient-descent steps with several restarts, the DCS models used only three steps without any restarts, an efficiency gain of orders of magnitude. Bora endorsed the DeepMind work, tweeting: “Faster and more accurate compressed sensing with meta-learning! Very interesting to see that latent space optimization leads to more stable GAN training :)”.

The researchers also trained a small model on MNIST to evaluate the advantage of latent optimization in their proposed CS-GANs. For quantitative evaluation, they trained both larger and standard models on CIFAR-10. Samples from CS-GANs using different numbers of gradient descent steps in latent optimization showed that “optimizing latent variables exhibits no mode collapse, one of the common failure modes of GAN training.”
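As a rough sketch of what latent optimization looks like in a CS-GAN (the step count and step size alpha are illustrative assumptions, and the paper’s actual update rule may differ), the latents are refined by a few gradient ascent steps on the discriminator’s score before the standard GAN update:

```python
import torch

def optimize_latents(z, G, D, steps=3, alpha=0.9):
    """Sketch of discriminator-guided latent optimization (steps and
    alpha are illustrative): nudge each latent so its generated sample
    scores as more realistic under the discriminator D."""
    z = z.clone().requires_grad_(True)
    for _ in range(steps):
        score = D(G(z)).sum()                   # discriminator realism score
        grad, = torch.autograd.grad(score, z)   # gradient w.r.t. the latents
        z = (z + alpha * grad).detach().requires_grad_(True)  # ascend the score
    return z.detach()

# The refined latents then feed the usual generator/discriminator updates.
```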

The paper Deep Compressed Sensing is on arXiv.

Journalist: Fangyu Cai | Editor: Michael Sarazen

