Learning with Instance-Dependent Noisy Labels by Hard Sample Selection with Anchor Hallucination

22 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Noisy label learning, semi-supervised learning
Abstract: Learning from noisily labeled data is common in real-world visual learning tasks. Mainstream Noisy-Label Learning (NLL) methods focus on sample-selection approaches, which typically divide the training dataset into clean and noisy subsets according to the loss distribution of the samples. However, they overlook the fact that clean samples with complex visual patterns may also yield large losses, especially for datasets with Instance-Dependent Noise (IDN), in which the probability of an image being mislabeled depends on its visual appearance. This paper builds on this observation and distinguishes complex clean samples from noisy ones. Specifically, we first select training samples with small initial losses to form an *easy* subset, whose samples are assumed to have simple patterns and correct labels. The remaining samples, which either have complex patterns or incorrect labels, form a *hard* subset. We then use the easy subset to hallucinate multiple anchors, which are in turn used to select clean samples from the hard subset, forming a *clean hard* subset. We further exploit the samples in these subsets via a semi-supervised training scheme to better characterize the decision boundary. Extensive experiments on synthetic and real-world instance-dependent noisy datasets show that our method outperforms state-of-the-art NLL methods.
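For illustration, the sketch below (not taken from the submission) shows one plausible way the abstract's pipeline could be instantiated: a small-loss split into easy and hard subsets, per-class anchors "hallucinated" by averaging random groups of easy-sample features, and a similarity test that admits hard samples into a clean-hard subset. The averaging rule, cosine-similarity criterion, and the `easy_fraction`, `n_anchors`, and `threshold` hyperparameters are all assumptions made for this sketch, not details from the paper.

```python
import numpy as np


def split_easy_hard(losses, easy_fraction=0.5):
    """Split sample indices into an 'easy' subset (smallest initial losses)
    and a 'hard' subset (the rest). easy_fraction is an assumed hyperparameter."""
    order = np.argsort(losses)
    n_easy = int(len(losses) * easy_fraction)
    return order[:n_easy], order[n_easy:]


def hallucinate_anchors(features, labels, easy_idx, n_anchors=3, seed=0):
    """Build several anchors per class by averaging random halves of that class's
    easy-sample features (an assumed stand-in for the anchor-hallucination step)."""
    rng = np.random.default_rng(seed)
    anchors = {}
    for c in np.unique(labels[easy_idx]):
        idx_c = easy_idx[labels[easy_idx] == c]
        size = max(1, len(idx_c) // 2)
        anchors[c] = np.stack([
            features[rng.choice(idx_c, size=size, replace=False)].mean(axis=0)
            for _ in range(n_anchors)
        ])
    return anchors


def select_clean_hard(features, labels, hard_idx, anchors, threshold=0.8):
    """Keep hard samples whose feature is close (cosine similarity above an
    assumed threshold) to at least one anchor of their given label's class."""
    clean_hard = []
    for i in hard_idx:
        c = labels[i]
        if c not in anchors:
            continue
        f = features[i] / (np.linalg.norm(features[i]) + 1e-12)
        a = anchors[c] / (np.linalg.norm(anchors[c], axis=1, keepdims=True) + 1e-12)
        if (a @ f).max() >= threshold:
            clean_hard.append(i)
    return np.array(clean_hard, dtype=int)


# Toy usage with random features and labels (illustrative only).
rng = np.random.default_rng(0)
features = rng.normal(size=(1000, 128))
labels = rng.integers(0, 10, size=1000)
losses = rng.random(1000)

easy_idx, hard_idx = split_easy_hard(losses)
anchors = hallucinate_anchors(features, labels, easy_idx)
clean_hard_idx = select_clean_hard(features, labels, hard_idx, anchors)
```

Under this reading, the easy subset and the clean-hard subset would supply labeled examples for a semi-supervised learner, while the remaining hard samples would be treated as unlabeled; the abstract does not specify the exact scheme, so that part is left out of the sketch.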
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5868