Projection-Enhanced Contrastive Learning and Linear Calibration for Exemplar-Free Class-Incremental Learning

ICLR 2026 Conference Submission 14836 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Exemplar-Free Class Incremental Learning, Prototype Calibration, Contrastive Learning
TL;DR: We propose a projection-enhanced contrastive learning strategy that learns richer class features, an old-class repulsion term that reduces feature congestion, and a closed-form similarity-weighted linear regression for efficient prototype calibration.
Abstract: Exemplar-Free Class-Incremental Learning (EFCIL) tackles the challenge of learning to discriminate between new and old classes without retaining any past exemplars. Contrastive learning offers a promising direction for mitigating feature-space congestion in EFCIL, yet applying it directly in the classification feature space interferes with the objective of learning class-discriminative features, harming performance. We therefore propose a novel EFCIL framework that decouples contrastive learning from classification via a projection head, retaining the benefits of contrastive learning while preserving rich class-discriminative features in the pre-projection space. To further reduce congestion between old and new classes, we propose an old-class repulsion strategy applied directly in the pre-projection space. Additionally, we eliminate the computational overhead incurred by current prototype calibration methods through a closed-form similarity-weighted linear regression update, enabling efficient yet effective adaptation of full prototype distributions. By integrating these three strategies, our proposed method outperforms existing state-of-the-art methods across several benchmarks. Code available at [https://anonymous.4open.science/r/iclr2026-D134](https://anonymous.4open.science/r/iclr2026-D134).
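To make the two architectural ideas in the abstract concrete, here is a minimal PyTorch sketch of the decoupling and the repulsion term. All names (`DecoupledContrastiveNet`, `old_class_repulsion`, `proj_dim`, `tau`) are hypothetical illustrations, not the authors' code: the classifier reads pre-projection features, the contrastive loss would be applied only to the projected embeddings `z`, and the repulsion loss is one plausible form of pushing current features away from stored old-class prototypes.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DecoupledContrastiveNet(nn.Module):
    """Sketch: classification uses pre-projection features h, while the
    contrastive objective sees only the projected embeddings z, so the
    two objectives do not compete in the same space."""

    def __init__(self, backbone, feat_dim, num_classes, proj_dim=128):
        super().__init__()
        self.backbone = backbone                    # any feature extractor
        self.classifier = nn.Linear(feat_dim, num_classes)
        self.proj = nn.Sequential(                  # contrastive projection head
            nn.Linear(feat_dim, feat_dim), nn.ReLU(),
            nn.Linear(feat_dim, proj_dim))

    def forward(self, x):
        h = self.backbone(x)                        # pre-projection features
        logits = self.classifier(h)                 # classification branch
        z = F.normalize(self.proj(h), dim=1)        # contrastive branch
        return logits, h, z

def old_class_repulsion(h, old_protos, tau=0.5):
    """One plausible repulsion loss in the pre-projection space: penalize
    high cosine similarity between current features and old prototypes
    (the paper's exact formulation may differ)."""
    h = F.normalize(h, dim=1)
    p = F.normalize(old_protos, dim=1)
    sims = h @ p.t() / tau                          # (batch, n_old_classes)
    return torch.logsumexp(sims, dim=1).mean()
```

The abstract's third component, a closed-form similarity-weighted linear regression for prototype calibration, admits the standard weighted ridge-regression solution. The sketch below is an assumption about how such an update could look (cosine-similarity weights, per-prototype linear map), not the paper's verified procedure:

```python
import numpy as np

def calibrate_prototype(proto_old, feats_old, feats_new, tau=0.1, lam=1e-3):
    """Update one old-class prototype via similarity-weighted ridge
    regression from the old to the new feature space (hypothetical sketch).

    proto_old : (d,)   prototype in the previous feature space
    feats_old : (n, d) current-task samples embedded by the old backbone
    feats_new : (n, d) the same samples embedded by the new backbone
    """
    # Cosine-similarity weights: samples closer to the prototype
    # contribute more to the regression (assumed weighting scheme).
    sims = feats_old @ proto_old
    sims /= np.linalg.norm(feats_old, axis=1) * np.linalg.norm(proto_old) + 1e-8
    w = np.exp(sims / tau)
    w /= w.sum()

    # Closed-form weighted ridge regression:
    #   W* = argmin_W sum_i w_i ||W x_i - y_i||^2 + lam ||W||_F^2
    #      = (Y^T D X)(X^T D X + lam I)^{-1}
    X, Y, D = feats_old, feats_new, np.diag(w)
    W = (Y.T @ D @ X) @ np.linalg.inv(X.T @ D @ X + lam * np.eye(X.shape[1]))
    return W @ proto_old                            # calibrated prototype
```

Because the map has a closed form, the calibration costs one d×d solve per update rather than any iterative optimization, which is consistent with the efficiency claim in the abstract.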
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 14836