LoRe - Logarithm Regularization for Few-Shot Class Incremental Learning

28 Sept 2024 (modified: 18 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: Few-Shot Class Incremental Learning, Continual Learning, Logarithmic Regularization, Wide Minima
Abstract: Few-Shot Class-Incremental Learning (FSCIL) aims to adapt to new classes from very limited data while retaining knowledge of all previously seen classes. Current FSCIL methods freeze the feature extractor during the incremental sessions to prevent catastrophic forgetting. However, to perform well on the incremental classes, many methods reserve feature space during base training to leave room for the incremental classes. We hypothesize that such feature space reservation sharpens the minima of the loss landscape, leading to sub-optimal performance. Motivated by the superior generalization of wide minima, we propose LoRe, a logarithmic regularization that guides model optimization toward wider minima. Moreover, we propose a denoised distance metric for computing similarity with poorly calibrated prototypes. Comprehensive evaluations across three benchmark datasets show that LoRe not only achieves state-of-the-art performance but also produces more robust prototypes. Additionally, we demonstrate that LoRe can be leveraged to enhance the performance of existing methods.
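The abstract does not give the exact form of either component, so the following is only a minimal sketch of how a logarithmic regularizer and a "denoised" prototype similarity could look in practice. All names, hyperparameters (`lam`, `shrink`), and the specific functional forms are illustrative assumptions, not the authors' formulation:

```python
import torch
import torch.nn.functional as F

def log_regularized_loss(logits, targets, lam=0.1, eps=1e-8):
    """Hypothetical sketch: cross-entropy plus a logarithmic penalty on the
    logit norm. The slow growth of log() penalizes very large logits without
    dominating the loss, one plausible way to bias training toward wider
    minima. LoRe's actual regularizer is not specified in the abstract."""
    ce = F.cross_entropy(logits, targets)
    reg = torch.log(1.0 + logits.norm(dim=1) + eps).mean()
    return ce + lam * reg

def denoised_cosine_scores(features, prototypes, shrink=0.5):
    """Hypothetical sketch of a 'denoised' distance: shrink poorly calibrated
    prototypes toward their mean before cosine matching, reducing the
    influence of noisy per-class estimates. Again, an assumed reading of the
    abstract, not the paper's metric."""
    center = prototypes.mean(dim=0, keepdim=True)
    denoised = shrink * prototypes + (1.0 - shrink) * center
    # Pairwise cosine similarity: (num_queries, num_classes)
    return F.cosine_similarity(
        features.unsqueeze(1), denoised.unsqueeze(0), dim=-1
    )
```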
Supplementary Material: pdf
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 13700