Keywords: Few-Shot Continual Learning; Class-Incremental Learning
TL;DR: This paper introduces singular value fine-tuning for few-shot class-incremental learning.
Abstract: Class-Incremental Learning (CIL) aims to sequentially learn new classes while retaining the knowledge obtained from previously encountered classes, thereby mitigating the challenge of catastrophic forgetting. In a more realistic scenario, classes in future sessions may contain only a few samples each, introducing the additional challenge of overfitting; this setting is referred to as Few-Shot Class-Incremental Learning (FSCIL). Existing works explore FSCIL from various perspectives, such as classifier calibration and backbone extension. Most of them treat the many-shot base session and the incremental few-shot sessions separately, as the model tends to overfit on few-shot classes. In this paper, we propose Singular Value Fine-tuning for few-shot Class-incremental Learning (SVFCL) to continually learn across both the base and incremental sessions on top of a pre-trained ViT encoder. SVFCL incorporates incremental adapters, each of which is attached to a corresponding pre-trained module and contains only a small number of learnable parameters, effectively reducing the risk of overfitting. Furthermore, since each adapter is task-specific, information from previous tasks is well preserved, mitigating catastrophic forgetting.
Our experimental results demonstrate that SVFCL achieves substantial improvements over state-of-the-art methods while requiring significantly less computational overhead and fewer training epochs.
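To make the mechanism named in the title concrete, below is a minimal, hypothetical PyTorch sketch of singular value fine-tuning with task-specific adapters. It assumes the common recipe of freezing the singular vectors of a pre-trained weight and learning only per-session offsets on the singular values; the class name SVFAdapter, the per-session delta parameterization, and all identifiers are illustrative assumptions, not the paper's actual implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SVFAdapter(nn.Module):
    # Hypothetical sketch: the pre-trained weight W is factored once as
    # W = U @ diag(S) @ Vh; U and Vh stay frozen, and each session learns
    # only a small delta on the singular values S, so the adapter holds
    # just min(out_features, in_features) parameters per session.
    def __init__(self, pretrained_linear: nn.Linear):
        super().__init__()
        U, S, Vh = torch.linalg.svd(
            pretrained_linear.weight.data, full_matrices=False
        )
        self.register_buffer("U", U)    # frozen left singular vectors
        self.register_buffer("S0", S)   # frozen pre-trained spectrum
        self.register_buffer("Vh", Vh)  # frozen right singular vectors
        self.bias = pretrained_linear.bias  # reused pre-trained bias
        if self.bias is not None:
            self.bias.requires_grad_(False)
        self.deltas = nn.ParameterDict()  # one task-specific delta per session

    def add_session(self, session_id: int) -> None:
        # New sessions get their own parameters; earlier deltas are untouched,
        # which is how task-specific adapters avoid overwriting old knowledge.
        self.deltas[str(session_id)] = nn.Parameter(torch.zeros_like(self.S0))

    def forward(self, x: torch.Tensor, session_id: int) -> torch.Tensor:
        S = self.S0 + self.deltas[str(session_id)]
        W = self.U @ torch.diag(S) @ self.Vh  # re-assembled weight
        return F.linear(x, W, self.bias)

# Usage: wrap one projection of a pre-trained ViT block.
adapter = SVFAdapter(nn.Linear(768, 768))
adapter.add_session(0)  # many-shot base session
adapter.add_session(1)  # first few-shot incremental session
out = adapter(torch.randn(4, 768), session_id=1)

Because each session only adds a vector of singular-value offsets, the trainable footprint per incremental session stays tiny, which is consistent with the abstract's claims of reduced overfitting risk and low computational overhead.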
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 8667