Keywords: noisy label; classification; robust deep learning
Abstract: Noisy labels are inevitable in real-world scenarios. Because deep neural networks have a strong capacity to memorize corrupted labels, noisy labels can cause significant performance degradation. Existing research on mitigating the negative effects of noisy labels has focused mainly on robust loss functions and sample selection, with comparatively little exploration of regularization at the level of model architecture. In this paper, we propose a Dynamic Connection Masking (DCM) mechanism for the widely used fully connected (FC) layer to enhance the robustness of classifiers against noisy labels. The mechanism adaptively masks less important edges during training by evaluating their information-carrying capacity. By preserving only a small number of critical edges for information propagation, DCM effectively reduces the gradient error caused by noisy labels. It can be seamlessly integrated into various noise-robust training methods, including robust loss functions and sample selection strategies, to build more robust deep networks. We further validate the applicability of DCM by extending it to the recently proposed Kolmogorov-Arnold Network (KAN) architecture. The experimental results reveal that KAN exhibits superior noise robustness over FC-based classifiers in real-world noisy scenarios. Extensive experiments on both synthetic and real-world benchmarks demonstrate that our method consistently outperforms state-of-the-art (SOTA) approaches. Code is available at https://anonymous.4open.science/r/DCM-0C0A.
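The abstract describes masking low-importance FC edges so that only a few critical connections propagate information. A minimal sketch of that idea, assuming weight magnitude as a stand-in for the paper's information-carrying capacity score and a hypothetical `keep_ratio` hyperparameter (neither is specified here):

```python
import numpy as np

def dcm_forward(x, W, b, keep_ratio=0.25):
    """Hypothetical sketch of a Dynamic Connection Masking forward pass.

    Edges (individual weights) are ranked by an importance score; only
    the top `keep_ratio` fraction is kept for information propagation.
    Weight magnitude is used here as an assumed proxy for the edge's
    information-carrying capacity, which the paper evaluates differently.
    """
    score = np.abs(W)                                 # assumed importance proxy
    k = max(1, int(keep_ratio * W.size))              # number of edges to keep
    threshold = np.partition(score.ravel(), -k)[-k]   # k-th largest score
    mask = (score >= threshold).astype(W.dtype)       # 1 = edge kept, 0 = masked
    return x @ (W * mask).T + b, mask

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))   # FC layer: 8 inputs -> 4 outputs
b = np.zeros(4)
x = rng.standard_normal((2, 8))   # a batch of two inputs
out, mask = dcm_forward(x, W, b, keep_ratio=0.25)
print(out.shape, int(mask.sum()))  # (2, 4) 8
```

Because masked edges contribute zero activation, the backward pass through them is also zero, which is the intuition behind the claimed reduction of gradient error from noisy labels.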
Primary Area: optimization
Submission Number: 8260