Keywords: Prototype Consolidation, Catastrophic Forgetting, Incremental Learning, Mixture-of-Experts
Abstract: Exemplar-free Class Incremental Learning requires the learning agent to incrementally acquire new class information and maintain past knowledge without access to samples from previous tasks. Despite the strong performance achieved by a subspace ensemble of a mixture of experts (MoE) with Gaussian prototypical networks, a critical gap remains: as downstream tasks arrive, the subspace representations of old classes are updated, causing prototype drift and, in turn, forgetting. To address this forgetting, we propose ProCEED, which dynamically realigns previous classes' representations in the latest subspace, adjusting the drifted class prototypes and preserving their decision boundaries. Specifically, we compute the inter-subspace angular drift between the prototypes of previous incremental stages and the current one, which captures the local semantic relationship between the incremental subspaces. This angular drift is then used to project the old tasks' prototypes into the subspace of the current incremental task. Furthermore, the model inherits combined knowledge from the MoE, supporting plasticity without extra computational burden. Consequently, ProCEED effectively balances stability and plasticity across incoming incremental tasks, allowing the model to learn continually. Experimental evaluations on challenging benchmark datasets show that ProCEED achieves superior accuracy compared to state-of-the-art class-incremental learning methods.
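To make the prototype-realignment step concrete, below is a minimal NumPy sketch (not taken from the submission) of one plausible realization: the inter-subspace drift between the old and updated feature spaces is estimated as an orthogonal map via Procrustes alignment of shared anchor embeddings, and old-task prototypes are then projected into the current subspace. The function names, the anchor-based estimator, and the toy data are illustrative assumptions; the paper's exact angular-drift computation may differ.

```python
import numpy as np

def estimate_drift(anchors_old, anchors_new):
    """Estimate an orthogonal map (a stand-in for the inter-subspace 'angular
    drift') between the old and updated feature subspaces via orthogonal
    Procrustes alignment. `anchors_old` / `anchors_new` are (n, d) embeddings
    of the same anchor samples under the old and the updated expert.
    Illustrative assumption, not necessarily the authors' exact estimator."""
    u, _, vt = np.linalg.svd(anchors_new.T @ anchors_old)
    return u @ vt  # (d, d) rotation mapping the old subspace onto the new one

def realign_prototypes(prototypes_old, drift):
    """Project old-task class prototypes into the current subspace so their
    decision boundaries remain valid after the expert update."""
    return prototypes_old @ drift.T

# Toy usage: 20 anchors, 8-d features, 3 old-class prototypes.
rng = np.random.default_rng(0)
anchors_old = rng.normal(size=(20, 8))
true_drift = np.linalg.qr(rng.normal(size=(8, 8)))[0]   # simulated subspace drift
anchors_new = anchors_old @ true_drift.T                 # anchors re-embedded after the update
protos_old = rng.normal(size=(3, 8))
protos_realigned = realign_prototypes(protos_old, estimate_drift(anchors_old, anchors_new))
```

In this toy setting the estimated map recovers the simulated drift, so the realigned prototypes match what re-computing them in the new subspace would give; in the exemplar-free setting, such a correction matters because old-class samples are unavailable for re-computing prototypes directly.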
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3122