Continual Learning for Long-Tailed Recognition

TMLR Paper2694 Authors

15 May 2024 (modified: 18 May 2024) · Under review for TMLR · CC BY-SA 4.0
Abstract: We propose Continual Learning for Long-Tailed Recognition (CLTR), a framework that employs standard off-the-shelf Continual Learning (CL) methods to address Long-Tailed Recognition (LTR) problems by first learning the majority classes (Head) and then the minority classes (Tail), without forgetting the majority. To ensure that our method is theoretically sound, we first prove that training a model on long-tailed data leads to weights similar to those obtained by training the same learner on the Head classes alone. This naturally necessitates a second step in which the model learns the Tail after the Head in a sequential manner. We then prove that employing CL effectively mitigates catastrophic forgetting in this setup and thus improves the model's performance on LTR. We evaluate the efficacy of our approach using several standard CL methods on multiple datasets (CIFAR100-LT, CIFAR10-LT, ImageNet-LT, and Caltech256), showing that CLTR achieves state-of-the-art performance on all the benchmarks. Further, we demonstrate the effectiveness of CLTR on the more challenging task of class-incremental LTR, surpassing the state-of-the-art methods in this area by notable margins. Lastly, extensive sensitivity analyses and detailed discussions further explore the underlying mechanisms of CLTR. Our work not only bridges LTR and CL in a systematic way, but also paves the way for leveraging future advances in CL methods to tackle LTR problems more effectively.
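The two-stage schedule the abstract describes (Head first, then Tail under a forgetting-mitigation constraint) can be illustrated with a minimal sketch. This is not the authors' implementation: the frequency-based Head/Tail split, the softmax model, and the simple L2-to-anchor penalty (a stand-in for an off-the-shelf CL regularizer such as EWC, which would weight the penalty per parameter) are all illustrative assumptions.

```python
# Illustrative sketch of the CLTR schedule (not the paper's code):
#   1) split classes into Head/Tail by sample frequency,
#   2) train on the Head,
#   3) train on the Tail while anchored to the Head solution
#      (an L2 pull toward the old weights stands in for a CL method).
import numpy as np

def split_head_tail(y, head_fraction=0.5):
    """Return (head_classes, tail_classes), most frequent classes first."""
    classes, counts = np.unique(y, return_counts=True)
    order = classes[np.argsort(-counts)]
    k = max(1, int(len(order) * head_fraction))
    return set(order[:k]), set(order[k:])

def train_softmax(X, y, n_classes, W=None, anchor=None, lam=0.0,
                  lr=0.1, epochs=200):
    """Softmax regression; lam pulls W toward `anchor` to limit forgetting."""
    n, d = X.shape
    if W is None:
        W = np.zeros((d, n_classes))
    Y = np.eye(n_classes)[y]                       # one-hot targets
    for _ in range(epochs):
        logits = X @ W
        logits -= logits.max(axis=1, keepdims=True)
        P = np.exp(logits)
        P /= P.sum(axis=1, keepdims=True)
        grad = X.T @ (P - Y) / n
        if anchor is not None and lam > 0:
            grad += lam * (W - anchor)             # forgetting-mitigation term
        W -= lr * grad
    return W

rng = np.random.default_rng(0)
# Synthetic long-tailed data: class 0 is the Head, classes 1-2 are the Tail.
means = np.array([[2.0, 0.0], [-2.0, 1.5], [-2.0, -1.5]])
sizes = [300, 30, 30]
X = np.vstack([rng.normal(means[c], 0.6, size=(m, 2))
               for c, m in enumerate(sizes)])
y = np.concatenate([np.full(m, c) for c, m in enumerate(sizes)])

head, tail = split_head_tail(y, head_fraction=1 / 3)
head_mask = np.isin(y, list(head))

# Stage 1: learn the Head classes.
W = train_softmax(X[head_mask], y[head_mask], n_classes=3)
# Stage 2: learn the Tail, anchored to the Head solution.
W = train_softmax(X[~head_mask], y[~head_mask], n_classes=3,
                  W=W.copy(), anchor=W.copy(), lam=1.0)

acc = (np.argmax(X @ W, axis=1) == y).mean()
print(f"overall accuracy: {acc:.2f}")
```

Without the anchor term in stage 2, the Tail-only gradient would be free to overwrite the Head solution; the penalty is what makes the sequential schedule behave like continual learning rather than plain fine-tuning.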
Submission Length: Regular submission (no more than 12 pages of main content)
Previous TMLR Submission Url: https://openreview.net/forum?id=bJwkKZ2U8B&nesting=2&sort=date-desc
Changes Since Last Submission: We have carefully reviewed the feedback provided by the Action Editor and made the necessary adjustments to our submission. Specifically, we have ensured that our submission is now fully anonymized by removing the supplementary material that contained identifiable information. This change addresses the desk rejection due to the inclusion of the author’s name within the file path of the code provided. We have made no other modifications to the manuscript content itself, ensuring that the primary research and findings remain consistent with our original submission. We appreciate the opportunity to resubmit our work and look forward to the review process.
Assigned Action Editor: ~ERIC_EATON1
Submission Number: 2694