Abstract: The performance of deep neural networks scales with dataset size and label quality, rendering the efficient mitigation of low-quality data annotations crucial for building robust and cost-effective systems. Existing strategies to address label noise exhibit severe limitations due to computational complexity and application dependency. In this work, we propose WANN, a Weighted Adaptive Nearest Neighbor approach that builds on self-supervised feature representations obtained from foundation models. To guide the weighted voting scheme, we introduce a reliability score $\eta$, which measures the likelihood of a data label being correct. WANN outperforms reference methods, including a linear layer trained with robust loss functions, on diverse datasets of varying size and under various noise types and severities. WANN also exhibits superior generalization on imbalanced data compared to both Adaptive-NNs (ANN) and fixed k-NNs. Furthermore, the proposed weighting scheme enhances supervised dimensionality reduction under noisy labels. This yields a significant boost in classification performance with 10x and 100x smaller image embeddings, minimizing latency and storage requirements. Our approach, emphasizing efficiency and explainability, emerges as a simple, robust solution to overcome inherent limitations of deep neural network training.
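The weighted voting idea described in the abstract can be illustrated with a minimal sketch. The exact definition of the reliability score $\eta$ and the adaptive neighborhood selection are the paper's contributions and are not reproduced here; this sketch assumes a simplified stand-in (the fraction of a sample's nearest neighbors sharing its label) and a fixed neighborhood size, purely to show how reliability-weighted k-NN voting down-weights noisily labeled training samples. All function names (`reliability`, `wann_predict`) are hypothetical.

```python
import math

def dist(a, b):
    # Euclidean distance between two embedding vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def reliability(i, embeddings, labels, k=2):
    # Hypothetical stand-in for the reliability score eta:
    # fraction of sample i's k nearest neighbors that share
    # its (possibly noisy) label. The paper's actual eta may differ.
    order = sorted((j for j in range(len(embeddings)) if j != i),
                   key=lambda j: dist(embeddings[i], embeddings[j]))
    neighbors = order[:k]
    return sum(labels[j] == labels[i] for j in neighbors) / k

def wann_predict(query, embeddings, labels, etas, k=3):
    # Weighted vote among the k nearest training samples; each
    # sample's vote is scaled by its reliability score, so samples
    # whose labels disagree with their neighborhood contribute less.
    # (WANN additionally adapts k per query; fixed here for brevity.)
    order = sorted(range(len(embeddings)),
                   key=lambda j: dist(query, embeddings[j]))[:k]
    votes = {}
    for j in order:
        votes[labels[j]] = votes.get(labels[j], 0.0) + etas[j]
    return max(votes, key=votes.get)

# Toy example: two clusters, with one mislabeled sample (index 2).
embeddings = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
labels = [0, 0, 1, 1, 1, 1]  # index 2 carries a noisy label
etas = [reliability(i, embeddings, labels, k=2)
        for i in range(len(embeddings))]
print(wann_predict((0.5, 0.5), embeddings, labels, etas, k=3))
```

In the toy example, the mislabeled sample receives a reliability score of 0, so its vote is suppressed and the query is assigned the clean-cluster label.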
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission:
- Deanonymized the manuscript and added the Acknowledgments section.
- Adjusted figure/table placement and line breaks to improve readability after adding the header.
- Refined wording and made minor corrections for readability and clarity.
- Added reference to Appendix A-B in Section 4.
Code: https://github.com/francescodisalvo05/wann-noisy-labels
Assigned Action Editor: ~Anurag_Arnab1
Submission Number: 3956