RACE: Robust adaptive and clustering elimination for noisy labels in continual learning

Published: 01 Jan 2025, Last Modified: 17 Jul 2025 · Knowl. Based Syst. 2025 · CC BY-SA 4.0
Abstract: Continual learning (CL) aims to incrementally acquire and retain knowledge from sequential tasks while mitigating catastrophic forgetting. However, noisy labels in real-world data streams severely degrade model performance by propagating biased knowledge across tasks, and existing replay-based methods are limited in resource-constrained and privacy-sensitive scenarios. To address this, we propose RACE, a novel replay-free framework that integrates noise-robust adaptive fine-tuning with clustering-based label correction to dynamically suppress label noise in CL. The proposed method operates in two synergistic stages: noise-robust adaptive fine-tuning and noise elimination via clustering. The former employs an adaptive cross-entropy (ACE) loss that dynamically weights samples according to prediction confidence; the latter eliminates label noise by clustering the pre-trained model’s feature representations. In addition, to handle varying noise levels across tasks, we design a dynamic learning strategy for RACE that flexibly balances the CL model’s learning performance and efficiency on each task. Extensive experiments on synthetic noisy datasets (CIFAR-10 and CIFAR-100) and real-world noisy datasets (WebVision and Clothing1M) demonstrate that RACE consistently outperforms existing state-of-the-art methods, particularly under high noise ratios.
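To make the two stages concrete, below is a minimal PyTorch/scikit-learn sketch of (i) a confidence-weighted cross-entropy loss and (ii) clustering-based label correction over feature embeddings. The abstract does not specify the exact form of the ACE loss or the clustering procedure, so the function names (`adaptive_weighted_ce`, `cluster_relabel`), the weighting exponent `gamma`, and the majority-vote relabeling rule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
import torch
import torch.nn.functional as F
from sklearn.cluster import KMeans


def adaptive_weighted_ce(logits, labels, gamma=1.0):
    """Confidence-weighted cross-entropy (sketch, not the paper's ACE loss).

    Samples whose predicted probability for their (possibly noisy) label is
    low receive a small weight, so suspect labels contribute less to the loss.
    `gamma` is a hypothetical exponent controlling how sharply weights decay.
    """
    per_sample_ce = F.cross_entropy(logits, labels, reduction="none")
    with torch.no_grad():
        probs = F.softmax(logits, dim=1)
        # Confidence the model assigns to each sample's given label.
        conf = probs.gather(1, labels.unsqueeze(1)).squeeze(1)
        weights = conf.pow(gamma)
        weights = weights / weights.sum().clamp_min(1e-8)
    return (weights * per_sample_ce).sum()


def cluster_relabel(features, noisy_labels, num_classes):
    """Cluster feature embeddings and relabel each cluster by majority vote.

    `features` is an (N, D) array of embeddings from a pre-trained backbone;
    the majority-vote rule is an assumed stand-in for the paper's
    clustering-based noise elimination.
    """
    km = KMeans(n_clusters=num_classes, n_init=10).fit(features)
    corrected = noisy_labels.copy()
    for c in range(num_classes):
        idx = np.where(km.labels_ == c)[0]
        if len(idx) == 0:
            continue
        majority = np.bincount(noisy_labels[idx], minlength=num_classes).argmax()
        corrected[idx] = majority
    return corrected
```

In this sketch, the loss weighting handles noise during fine-tuning, while the clustering step corrects labels offline using the pre-trained feature space; how the two stages are scheduled per task (the dynamic learning strategy) is left unspecified here.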