IBM Uses Continual Learning to Avoid the Amnesia Problem in Neural Networks

Continual learning might help neural networks avoid the famous catastrophic forgetting problem.

Jesus Rodriguez
DataSeries


Source: https://www.ups.com/us/en/services/knowledge-center/article.page?name=we-all-need-to-forget-even-robots&kid=14baf79

I recently started an AI-focused educational newsletter that already has over 65,000 subscribers. TheSequence is a no-BS (meaning no hype, no news, etc.) ML-oriented newsletter that takes 5 minutes to read. The goal is to keep you up to date with machine learning projects, research papers, and concepts. Please give it a try by subscribing below:

I often joke that neural networks suffer from a continuous amnesia problem: every time they are retrained, they lose the knowledge accumulated in previous iterations. Building neural networks that can learn incrementally without forgetting is one of the existential challenges facing the current generation of deep learning solutions. Over a year ago, researchers from IBM published a paper proposing a continual learning method that allows the implementation of neural networks that can build on previously acquired knowledge.
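To make the amnesia problem concrete, below is a minimal sketch of one common continual learning technique, experience replay, where a small buffer of past examples is mixed into every new training batch so the network keeps rehearsing old knowledge while learning new tasks. This is a generic illustration in PyTorch, not the specific method from the IBM paper; the buffer capacity, model architecture, and synthetic two-task setup are all illustrative assumptions.

```python
import random
import torch
import torch.nn as nn

class ReplayBuffer:
    """A small memory of past (input, label) pairs for rehearsal."""

    def __init__(self, capacity=500):
        self.capacity = capacity
        self.data = []
        self.seen = 0

    def add(self, x, y):
        # Reservoir sampling keeps a uniform sample over all examples seen.
        for xi, yi in zip(x, y):
            self.seen += 1
            if len(self.data) < self.capacity:
                self.data.append((xi, yi))
            else:
                j = random.randrange(self.seen)
                if j < self.capacity:
                    self.data[j] = (xi, yi)

    def sample(self, k):
        batch = random.sample(self.data, min(k, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()
buffer = ReplayBuffer()

def train_step(x, y):
    # Mix the current batch with replayed examples from earlier tasks,
    # so gradients on the new task do not simply overwrite old knowledge.
    if buffer.data:
        rx, ry = buffer.sample(len(x))
        x, y = torch.cat([x, rx]), torch.cat([y, ry])
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
    return loss.item()

# Two synthetic "tasks" trained in sequence, simulated by shifting the
# input distribution; these are hypothetical stand-ins for real tasks.
for task in range(2):
    x = torch.randn(256, 20) + task * 2.0
    y = torch.randint(0, 2, (256,))
    for i in range(0, 256, 32):
        xb, yb = x[i:i + 32], y[i:i + 32]
        train_step(xb, yb)
        buffer.add(xb, yb)
```

Without the replayed examples, training on the second task would overwrite the weights that served the first one, which is exactly the amnesia described above; the rehearsal buffer is one of the simplest ways to blunt that effect.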
