Studying Generalization on Memory-Based Methods in Continual Learning

Published: 03 Jul 2023, Last Modified: 10 Jul 2023 | LXAI @ ICML 2023 Regular Deadline Poster
Keywords: continual learning, out-of-distribution generalization, systematic generalization, spurious correlations
TL;DR: We study spurious correlations learned by replay-based models in a continual learning setting.
Abstract: One of the objectives of Continual Learning is to learn new concepts continually over a stream of experiences while avoiding catastrophic forgetting. To mitigate complete knowledge overwriting, memory-based methods store a fraction of data from previous distributions for use during training. Although these methods produce good results, few studies have tested their out-of-distribution generalization properties or whether they overfit the replay memory. In this work, we show that although these methods can help with traditional in-distribution generalization, they can strongly impair out-of-distribution generalization by learning spurious features and correlations. In a controlled environment built with the Synbols benchmark generator (Lacoste et al., 2020), we demonstrate that this lack of out-of-distribution generalization mainly occurs in the linear classifier.
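To make the replay mechanism described in the abstract concrete, below is a minimal sketch of the kind of memory-based method under study: a fixed-capacity buffer filled by reservoir sampling, whose contents are mixed into each training step. This is an illustrative assumption about the setup, not the authors' implementation; the `ReplayBuffer` class, the buffer capacity, and the toy stream are all hypothetical.

```python
import random

class ReplayBuffer:
    """Fixed-capacity replay memory filled by reservoir sampling,
    so the buffer holds a uniform random sample of the stream seen so far."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = []      # stored (x, y) pairs from past experiences
        self.num_seen = 0     # total number of examples observed

    def add(self, x, y):
        self.num_seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append((x, y))
        else:
            # Keep the new example with probability capacity / num_seen.
            idx = random.randrange(self.num_seen)
            if idx < self.capacity:
                self.buffer[idx] = (x, y)

    def sample(self, batch_size):
        # Replay batch to interleave with the current experience's data.
        return random.sample(self.buffer, min(batch_size, len(self.buffer)))

# Toy usage: a synthetic stream standing in for successive experiences.
stream = [((float(i % 10),), i % 2) for i in range(1000)]
buffer = ReplayBuffer(capacity=200)
for x, y in stream:
    replay = buffer.sample(batch_size=32)
    batch = [(x, y)] + replay  # current example mixed with replayed ones
    # model.train_step(batch)  # placeholder: actual model update goes here
    buffer.add(x, y)
```

Each update is thus taken on current data concatenated with a small, fixed-budget sample of past data; it is in this regime that the abstract reports spurious features being learned, concentrated in the linear classifier.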
Submission Type: Non-Archival
Supplementary Material: pdf
Submission Number: 9