IMEX-Reg: Implicit-Explicit Regularization in the Function Space for Continual Learning

Published: 02 May 2024, Last Modified: 28 May 2024
Accepted by TMLR
Abstract: Continual learning (CL) remains one of the long-standing challenges for deep neural networks due to catastrophic forgetting of previously acquired knowledge. Although rehearsal-based approaches have been fairly successful in mitigating catastrophic forgetting, they suffer from overfitting on buffered samples and prior information loss, hindering generalization under low-buffer regimes. Inspired by how humans learn using strong inductive biases, we propose IMEX-Reg to improve the generalization performance of experience rehearsal in CL under low-buffer regimes. Specifically, we employ a two-pronged implicit-explicit regularization approach using contrastive representation learning (CRL) and consistency regularization. To further leverage the global relationship between representations learned using CRL, we propose a regularization strategy to guide the classifier toward the activation correlations in the unit hypersphere of the CRL. Our results show that IMEX-Reg significantly improves generalization performance and outperforms rehearsal-based approaches in several CL scenarios. It is also robust to natural and adversarial corruptions, with less task-recency bias. Additionally, we provide theoretical insights to further support our design decisions.
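To make the abstract's two-pronged idea concrete, the following PyTorch-style sketch shows one possible way to combine experience rehearsal with a supervised contrastive loss (the implicit regularizer, acting on projections on the unit hypersphere) and a logit-consistency loss on buffered samples (the explicit regularizer). Everything here is an illustrative assumption rather than the authors' method: the buffer.sample() interface, the backbone/head/projector split, and the loss weights alpha and beta are hypothetical; the actual IMEX-Reg implementation is in the repository linked below.

```python
# Minimal, assumption-laden sketch of rehearsal combined with contrastive
# (implicit) and consistency (explicit) regularization. Not the authors'
# implementation; see the linked repository for the real IMEX-Reg code.
import torch
import torch.nn.functional as F

def supcon_loss(z, labels, temperature=0.1):
    """Supervised contrastive loss over L2-normalized projections z,
    pulling same-class samples together on the unit hypersphere."""
    z = F.normalize(z, dim=1)
    sim = z @ z.t() / temperature                       # pairwise similarities
    pos_mask = labels.unsqueeze(0) == labels.unsqueeze(1)
    pos_mask.fill_diagonal_(False)                      # exclude self-pairs
    logits = sim - sim.max(dim=1, keepdim=True).values.detach()  # stability
    exp = torch.exp(logits)
    denom = exp.sum(dim=1) - exp.diag()                 # drop self-similarity
    log_prob = logits - torch.log(denom + 1e-8).unsqueeze(1)
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    return -(log_prob * pos_mask).sum(dim=1).div(pos_count).mean()

def training_step(backbone, head, projector, x_cur, y_cur, buffer,
                  alpha=1.0, beta=0.5):
    # Rehearsal: mix the current batch with buffered samples. The
    # (hypothetical) buffer also returns the logits that were stored
    # when each sample was added.
    x_buf, y_buf, logits_buf = buffer.sample()
    x = torch.cat([x_cur, x_buf])
    y = torch.cat([y_cur, y_buf])

    feats = backbone(x)
    logits = head(feats)
    ce = F.cross_entropy(logits, y)                     # task loss

    # Implicit regularization: contrastive learning in a projection space.
    crl = supcon_loss(projector(feats), y)

    # Explicit regularization: keep current predictions on buffered samples
    # consistent with their stored logits (DER-style consistency).
    cons = F.mse_loss(logits[len(x_cur):], logits_buf)

    return ce + alpha * crl + beta * cons
```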
Submission Length: Regular submission (no more than 12 pages of main content)
Video: https://www.youtube.com/watch?v=X1Qh_Czx-NM
Code: https://github.com/NeurAI-Lab/IMEX-Reg
Assigned Action Editor: ~Ahmad_Beirami1
Submission Number: 2154