Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Continual Learning, Lifelong Learning, Inductive Bias, Multitask Learning, Catastrophic Forgetting, Experience Rehearsal, Task Attention
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Continual learning (CL) remains one of the long-standing challenges for deep
neural networks due to catastrophic forgetting of previously acquired knowledge.
Although rehearsal-based approaches have been fairly successful in mitigating
catastrophic forgetting, they suffer from overfitting on buffered samples and prior
information loss, hindering generalization under low-buffer regimes. Inspired
by how humans learn using strong inductive biases, we propose IMEX-Reg to
improve the generalization performance of experience rehearsal in CL under low-buffer regimes. Specifically, we employ a two-pronged implicit-explicit regularization approach using contrastive representation learning (CRL) and consistency
regularization. To further leverage the global relationship between representations
learned using CRL, we propose a novel regularization strategy to guide the classifier toward the activation correlations in the unit hypersphere of the CRL. Our
results show that IMEX-Reg significantly improves generalization performance and
outperforms rehearsal-based approaches in several CL scenarios. It is also robust
to natural and adversarial corruptions with less task-recency bias. Additionally, we
provide theoretical insights to further support our design decisions.
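The two-pronged loss described in the abstract could be sketched roughly as follows. This is an illustrative NumPy sketch, not the authors' implementation: the function names, the MSE form of the consistency term, and the weighting coefficient `alpha` are assumptions; only the general idea (a supervised contrastive loss on unit-hypersphere embeddings plus a consistency penalty against logits stored in the rehearsal buffer) follows the abstract.

```python
import numpy as np

def l2_normalize(x, eps=1e-12):
    # Project embeddings onto the unit hypersphere.
    return x / (np.linalg.norm(x, axis=-1, keepdims=True) + eps)

def supcon_loss(z, labels, temperature=0.1):
    # Supervised contrastive loss over a batch of embeddings z (N, D):
    # same-label samples are positives, all other samples are negatives.
    z = l2_normalize(z)
    n = z.shape[0]
    sim = z @ z.T / temperature
    self_mask = np.eye(n, dtype=bool)
    sim = np.where(self_mask, -np.inf, sim)  # exclude self-similarity
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    pos = (labels[:, None] == labels[None, :]) & ~self_mask
    has_pos = pos.sum(axis=1) > 0  # anchors without positives are skipped
    per_anchor = np.where(pos, log_prob, 0.0).sum(axis=1)[has_pos] / pos.sum(axis=1)[has_pos]
    return -per_anchor.mean()

def consistency_loss(logits, buffered_logits):
    # Penalize drift between current predictions and the logits that were
    # stored alongside the buffered samples (MSE form assumed here).
    return np.mean((logits - buffered_logits) ** 2)

def combined_loss(z, labels, logits, buffered_logits, alpha=1.0):
    # Implicit (contrastive) + explicit (consistency) regularization.
    return supcon_loss(z, labels) + alpha * consistency_loss(logits, buffered_logits)
```

In a CL training loop, `z` and `logits` would come from the current model on a mixed batch of new and buffered samples, while `buffered_logits` are replayed from the rehearsal buffer.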
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7749