Online Continual Learning via Pursuing Class-conditional Function

23 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Online Continual Learning, Class-incremental Learning, Inter-class Imbalance
Abstract: Online continual learning is a challenging problem in which a model must learn from a non-stationary data stream while avoiding catastrophic forgetting. Inter-class imbalance during training has been identified as a major cause of forgetting, biasing model predictions towards recently learned classes. In this paper, we theoretically show that inter-class imbalance is entirely attributable to imbalanced class-priors, and that the class-conditional function learned from intra-class distributions is the Bayes-optimal classifier. Accordingly, we show that a simple adjustment of the model logits during training effectively counteracts the class-prior bias and attains the corresponding Bayes optimum. By eliminating class-priors and pursuing class-conditionals, our method mitigates the impact of inter-class imbalance not only in class-incremental setups but also in realistic general setups, with minimal additional computational cost. We thoroughly evaluate our approach on various benchmarks and demonstrate significant performance improvements over prior art. For example, our approach improves upon the best baseline by 4.6% on CIFAR10.
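The "simple adjustment of model logits" is not spelled out on this page; the following is a minimal, hypothetical sketch assuming the adjustment adds the log of the empirical class-priors observed so far in the stream to the logits before the cross-entropy loss (in the spirit of logit-adjusted losses), so that the raw logits are pushed to model the prior-free class-conditional term. The model, stream, and running-count scheme are illustrative, not the authors' implementation.

    import torch
    import torch.nn.functional as F

    def prior_adjusted_loss(logits, targets, class_counts, eps=1e-8):
        """Cross-entropy on logits shifted by the log of the class-priors seen so far."""
        priors = class_counts / class_counts.sum().clamp(min=1.0)
        # Adding the log-prior here absorbs the class-prior into the loss,
        # so the unadjusted logits approximate the class-conditional score.
        adjusted = logits + torch.log(priors + eps)
        return F.cross_entropy(adjusted, targets)

    # Usage sketch: maintain running class counts over the online stream.
    num_classes = 10
    class_counts = torch.zeros(num_classes)
    model = torch.nn.Linear(32, num_classes)   # stand-in for the real network
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    for _ in range(5):                          # a few synthetic mini-batches
        x = torch.randn(16, 32)
        y = torch.randint(0, num_classes, (16,))
        class_counts += torch.bincount(y, minlength=num_classes).float()
        loss = prior_adjusted_loss(model(x), y, class_counts)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

At inference time, under this reading, predictions are made from the unadjusted logits, which is what removes the bias towards recently (and therefore more frequently) seen classes.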
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6729