Understanding Catastrophic Forgetting and Remembering in Continual Learning with Optimal Relevance Mapping

Published: 10 Dec 2021, Last Modified: 05 May 2023 · NeurIPS 2021 Workshop MetaLearn Poster
Keywords: continual learning, lifelong learning, catastrophic interference
Abstract: Catastrophic forgetting in neural networks is a significant problem for continual learning. A majority of current methods replay previous data during training, which violates the constraints of a strict continual learning setup. Additionally, current approaches that deal with forgetting ignore the problem of catastrophic remembering, i.e. the worsening ability to discriminate between data from different tasks. In our work, we introduce Relevance Mapping Networks (RMNs). The mappings reflect the relevance of the weights for the task at hand by assigning large values to essential parameters. We show that RMNs learn an optimized representational overlap that overcomes the twin problems of catastrophic forgetting and remembering. Our approach achieves state-of-the-art performance across many common continual learning benchmarks, even significantly outperforming data-replay methods while not violating the constraints of a strict continual learning setup. Moreover, RMNs retain the ability to discriminate between old and new tasks in an unsupervised manner, thus proving their resilience against catastrophic remembering.
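The core idea described in the abstract, a per-task relevance mapping that modulates which shared weights participate in each task's forward pass, can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the near-binary masks, the dictionary layout, and the `forward` helper are all assumptions introduced here for clarity.

```python
import numpy as np

# Hypothetical sketch: each task t gets a relevance mapping m_t that
# modulates a shared weight matrix element-wise, so only task-relevant
# parameters contribute to that task's computation.
rng = np.random.default_rng(0)

W = rng.standard_normal((4, 3))  # shared weights across all tasks

# Assumed near-binary relevance mappings (the paper learns these; here
# they are random placeholders for illustration only).
masks = {
    "task_a": (rng.random((4, 3)) > 0.5).astype(float),
    "task_b": (rng.random((4, 3)) > 0.5).astype(float),
}

def forward(x, task):
    """Forward pass using only the task-relevant subset of weights."""
    return x @ (W * masks[task])

x = np.ones(4)
y_a = forward(x, "task_a")
y_b = forward(x, "task_b")

# The overlap between two mappings measures how much representation the
# tasks share; controlling it is what trades off forgetting vs. remembering.
overlap = float((masks["task_a"] * masks["task_b"]).mean())
```

Because each task only updates (and reads through) its own mapped subset of `W`, parameters irrelevant to a task are shielded from interference, while deliberate overlap between mappings allows transfer between related tasks.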
TL;DR: Parameter Relevance Mapping method to alleviate catastrophic forgetting and remembering in a strict continual learning setup.
Contribution Process Agreement: Yes
Poster Session Selection: Poster session #3 (16:50 UTC)