Class Incremental Learning (CIL) has gained significant attention in recent years due to its potential to adaptively learn from a non-stationary data distribution. The central challenge of CIL is the model's ability to learn new classes without forgetting previously acquired knowledge. Recent research has achieved significant milestones, yet the continuity of learning can be further strengthened by integrating the concepts of "self-training", "out-of-distribution recognition", and "data drift". In this paper, we propose a novel approach that integrates "Continual Learning", "Self-Training", "Out-of-Distribution recognition", and "Data Drift" to advance the capabilities of class incremental learning systems. Drawing inspiration from works such as "A Theoretical Study on Solving Continual Learning" and "CSI: Novelty Detection via Contrastive Learning on Distributionally Shifted Instances", we propose a model that satisfies the four concepts mentioned above. Our experimental results demonstrate the efficacy of this method in mitigating catastrophic forgetting and ensuring consistent performance across a diverse range of classes.
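As a minimal illustration of the out-of-distribution recognition component (a sketch only, not the paper's actual model), a confidence-threshold detector over a classifier's softmax output can route low-confidence samples toward novelty handling; the threshold value here is an illustrative assumption:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - np.max(logits, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=-1, keepdims=True)

def is_ood(logits, threshold=0.5):
    """Flag a sample as out-of-distribution when the maximum
    softmax probability falls below `threshold`.
    `threshold=0.5` is a hypothetical value, not from the paper."""
    return float(np.max(softmax(logits))) < threshold

# A confident prediction stays in-distribution...
print(is_ood(np.array([8.0, 0.1, 0.2])))   # → False
# ...while a near-uniform prediction is flagged as a potential new class.
print(is_ood(np.array([0.1, 0.2, 0.15])))  # → True
```

In a CIL pipeline, samples flagged this way could seed self-training on newly arriving classes rather than being forced into the known-class head.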