Enabling Continual Learning in Neural Networks with Meta Learning

Anonymous

16 May 2019 (modified: 05 May 2023) · Submitted to AMTL 2019
Keywords: continual learning, meta learning
Abstract: Catastrophic forgetting in neural networks is one of the best-known problems in continual learning. Previous attempts to address it focus on preventing important weights from changing; such methods often require task boundaries to learn effectively and do not support backward transfer. In this paper, we propose a meta-learning algorithm that learns to reconstruct the gradients of old tasks w.r.t. the current parameters and combines these reconstructed gradients with the current gradient, enabling both continual learning and backward transfer from the current task to previous tasks. Experiments on standard continual learning benchmarks show that our algorithm effectively prevents catastrophic forgetting and supports backward transfer.
TL;DR: We propose a meta-learning algorithm for continual learning that effectively prevents catastrophic forgetting and supports backward transfer learning.
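
The mechanism described in the abstract can be illustrated with a minimal sketch. Assuming a PyTorch-style setup, a small meta-network (here called grad_predictor, a hypothetical name) reconstructs an old task's gradient at the current parameters, and each update mixes that reconstruction with the current task's gradient. The mixing coefficient alpha, the helper names, and the predictor architecture are illustrative assumptions, not details from the paper.

```python
# Hedged sketch: combine the current task's gradient with a reconstructed
# old-task gradient predicted from the current parameters.
import torch
import torch.nn as nn

model = nn.Linear(10, 2)  # the continually trained model (toy size)
n_params = sum(p.numel() for p in model.parameters())

# Hypothetical meta-network: maps the flattened current parameters to a
# reconstruction of an old task's gradient at those parameters.
grad_predictor = nn.Sequential(
    nn.Linear(n_params, 64), nn.ReLU(), nn.Linear(64, n_params)
)

def flat_params(m):
    # Flatten all parameters into one detached vector.
    return torch.cat([p.detach().reshape(-1) for p in m.parameters()])

def combined_step(x, y, loss_fn, lr=0.1, alpha=0.5):
    # Gradient of the current task's loss w.r.t. the parameters.
    loss = loss_fn(model(x), y)
    current_grads = torch.autograd.grad(loss, list(model.parameters()))

    # Reconstructed old-task gradient (alpha is a hypothetical mixing weight).
    old_grad_flat = grad_predictor(flat_params(model)).detach()
    offset = 0
    with torch.no_grad():
        for p, g in zip(model.parameters(), current_grads):
            g_old = old_grad_flat[offset: offset + p.numel()].view_as(p)
            offset += p.numel()
            # Step along the combined direction.
            p -= lr * ((1 - alpha) * g + alpha * g_old)

# Usage: one combined update on a toy batch.
combined_step(torch.randn(4, 10), torch.randint(0, 2, (4,)), nn.CrossEntropyLoss())
```

How grad_predictor is meta-trained (e.g., to regress stored old-task gradients) is not specified here; the sketch only shows the gradient-combination step the abstract describes.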