Learning to Remember: A Synaptic Plasticity Driven Framework for Continual Learning

Oleksiy Ostapenko, Tassilo Klein, Mihai Puscas, Patrick Jaehnichen and Moin Nabi

SAP AI Research
Mar 26, 2019


Conference on Computer Vision and Pattern Recognition (CVPR 2019), California, USA (to appear)

Models trained in the context of continual learning (CL) should be able to learn from a stream of data over an undefined period of time. The main challenges here are: 1) maintaining old knowledge while benefiting from it when learning new tasks, and 2) guaranteeing model scalability as the amount of data to learn from grows. To tackle these challenges, we introduce Dynamic Generative Memory (DGM), a synaptic plasticity driven framework for continual learning. DGM relies on conditional generative adversarial networks with learnable connection plasticity realized through neural masking. Specifically, we evaluate two variants of neural masking: applied (i) to layer activations and (ii) to connection weights directly. Furthermore, we propose a dynamic network expansion mechanism that ensures sufficient model capacity to accommodate continually incoming tasks. The amount of added capacity is determined dynamically from the learned binary mask. We evaluate DGM in a continual learning setup on visual classification tasks and compare it to state-of-the-art methods.
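To make the masking idea concrete, here is a minimal PyTorch sketch of per-task binary masks learned via a straight-through estimator, covering both variants from the abstract. The names (`BinarizeSTE`, `MaskedLinear`) and the single-layer setup are illustrative assumptions, not the paper's implementation, which learns such masks inside the layers of a conditional GAN generator:

```python
import torch
import torch.nn as nn


class BinarizeSTE(torch.autograd.Function):
    """Hard 0/1 threshold with a straight-through gradient estimator."""

    @staticmethod
    def forward(ctx, scores):
        return (scores > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        # Treat the threshold as the identity in the backward pass so the
        # real-valued scores still receive gradients during training.
        return grad_output


class MaskedLinear(nn.Module):
    """Linear layer with one learnable binary mask per task.

    mask_weights=True masks connection weights directly (variant ii);
    otherwise the mask is applied to the output activations (variant i).
    """

    def __init__(self, in_features, out_features, num_tasks, mask_weights=False):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.mask_weights = mask_weights
        shape = (out_features, in_features) if mask_weights else (out_features,)
        # Real-valued scores per task, binarized on the fly in forward().
        self.scores = nn.ParameterList(
            [nn.Parameter(0.01 * torch.randn(shape)) for _ in range(num_tasks)]
        )

    def forward(self, x, task_id):
        mask = BinarizeSTE.apply(self.scores[task_id])
        if self.mask_weights:
            return nn.functional.linear(x, self.linear.weight * mask, self.linear.bias)
        return self.linear(x) * mask
```

For instance, `MaskedLinear(128, 64, num_tasks=5)(torch.randn(8, 128), task_id=0)` runs a forward pass with the first task's mask; the shared weights stay fixed across tasks while each task trains only its own scores.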
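The expansion mechanism can be sketched in the same spirit: once a task is learned, the number of units its binary mask reserved tells us how much capacity it consumed, and the layer is widened by that amount. Again a hedged sketch with a hypothetical helper (`expand_linear`), not the authors' exact procedure:

```python
import torch
import torch.nn as nn


def expand_linear(layer: nn.Linear, mask: torch.Tensor) -> nn.Linear:
    """Return a wider copy of `layer`, adding one fresh output unit for
    every unit the finished task's 0/1 activation mask reserved."""
    num_new = int(mask.sum().item())  # capacity consumed by the last task
    if num_new == 0:
        return layer
    wider = nn.Linear(layer.in_features, layer.out_features + num_new)
    with torch.no_grad():
        # Preserve old knowledge: copy the trained parameters over,
        # leaving the freshly appended rows at random initialization.
        wider.weight[: layer.out_features] = layer.weight
        wider.bias[: layer.out_features] = layer.bias
    return wider
```

In a full pipeline this step would run per layer after each task, so every new task starts with a comparable budget of free, unmasked units.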

PDF
