Resurfacing the Instance-only Dependent Label Noise Model through Loss Correction

ICLR 2026 Conference Submission25088 Authors

20 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: label noise, loss correction, instance-dependence, risk equivalence
TL;DR: We resurrect the instance-only dependent label noise model via loss correction that connects the empirical-noisy-risk with the true-clean-risk.
Abstract: We investigate the label noise problem in supervised binary classification and resurface the underutilized instance-_only_ dependent noise model through loss correction. On the one hand, based on risk equivalence, the instance-aware loss correction scheme completes the bridge from _empirical noisy risk minimization_ to _true clean risk minimization_, provided the base loss is classification-calibrated (e.g., cross-entropy). On the other hand, the instance-only dependent modeling of the label noise at the core of the correction lets us estimate a single value per instance instead of a full transition matrix. Furthermore, estimating the transition rates becomes a flexible process, for which we offer several computationally efficient procedures. Empirical findings across dataset domains (image, audio, tabular) and learners (neural networks, gradient-boosted machines) validate the promised generalization ability of the method.
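The loss-correction idea in the abstract can be sketched for the binary, symmetric instance-only case: if each instance x carries a single label-flip rate rho(x) < 1/2 (one value per instance rather than a transition matrix), a Natarajan-style backward correction makes the expected corrected loss under the noisy label equal the clean loss. The function names and the symmetric-rate form below are illustrative assumptions, not the paper's exact scheme.

```python
import math

def bce(y, score):
    # Logistic (binary cross-entropy) loss for label y in {-1, +1}
    # and a real-valued classifier score f(x).
    return math.log1p(math.exp(-y * score))

def corrected_loss(y_noisy, score, rho):
    # Backward-corrected loss for an instance with flip rate rho = rho(x) < 1/2.
    # Unbiasedness: E over label flips of corrected_loss equals the clean bce,
    # i.e. (1-rho)*corrected_loss(y) + rho*corrected_loss(-y) == bce(y, score).
    return ((1.0 - rho) * bce(y_noisy, score)
            - rho * bce(-y_noisy, score)) / (1.0 - 2.0 * rho)
```

Averaging this corrected loss over a noisily labeled sample therefore estimates the clean risk, which is the risk-equivalence bridge the abstract refers to; the per-instance rho(x) is the single quantity that must be estimated.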
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 25088