Softmax-Weighted Pseudo-Label Refinement for Enhancing Robustness to Label Noise

12 Apr 2025 (modified: 12 Apr 2025) · MIDL 2025 Short Papers Submission · CC BY 4.0
Keywords: Instance-dependent label noise, Self-supervised learning, Pseudo-label refinement, Dynamic loss weighting
TL;DR: We propose a hybrid framework that combines contrastive self-supervised pretraining with iterative pseudo-label refinement to mitigate instance-dependent label noise.
Abstract: Deep neural networks typically require large-scale, accurately labelled datasets to perform well, but in practice, particularly in medical imaging, labels are frequently corrupted by noise, especially instance-dependent noise. In this work, we propose a novel framework to address instance-dependent label noise by integrating three key components: (i) self-supervised pretraining using SimCLR to learn robust, noise-agnostic feature representations; (ii) an iterative pseudo-label refinement strategy employing a stage-wise consensus mechanism to progressively correct mislabelled samples; and (iii) a softmax-weighted cross-entropy loss that dynamically downweights uncertain predictions. We validate our approach on benchmark datasets such as CIFAR-10 and CIFAR-100 corrupted with synthetic noise at 20% and 30% levels, demonstrating significant improvements over state-of-the-art methods. We further validate our method on the Chest X-ray and Chaoyang medical imaging datasets.
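The abstract does not give the exact form of the softmax-weighted cross-entropy loss, but a common realization of the idea it describes is to scale each sample's cross-entropy term by the model's own confidence (its maximum softmax probability), so that uncertain predictions contribute less to the gradient. The sketch below illustrates this interpretation in plain NumPy; the function names and the choice of max-probability as the weight are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def softmax_weighted_ce(logits, labels, eps=1e-12):
    """Cross-entropy where each sample's loss is scaled by the model's
    confidence (max softmax probability), downweighting uncertain
    predictions. A sketch of the idea, not the paper's exact loss."""
    probs = softmax(logits)
    n = logits.shape[0]
    # Per-sample cross-entropy against the (possibly noisy) labels.
    ce = -np.log(probs[np.arange(n), labels] + eps)
    # Confidence weight in [1/num_classes, 1]: low for flat predictions.
    weights = probs.max(axis=1)
    return float((weights * ce).mean())
```

Because the weight never exceeds 1, this loss is bounded above by the unweighted cross-entropy, and samples on which the model is unsure (flat softmax) are damped the most.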
Submission Number: 111
