Efficiently Meta-Learning for Robust Deep Networks without Prior Unbiased Set

22 Sept 2022 (modified: 13 Feb 2023) · ICLR 2023 Conference Withdrawn Submission · Readers: Everyone
Keywords: Robust deep learning, Noisy Label, Meta-learning, KD
Abstract: Learning with noisy labels is a practically challenging problem in robust deep learning. Recent efforts to improve robustness meta-learn sample weights or a transition matrix from a prior unbiased set; previous meta-learning based approaches therefore generally assume that such a prior unbiased set exists. Unfortunately, this assumption unrealistically simplifies the task of learning with noisy labels in real-world scenarios; even worse, the update iterations in previous meta-learning algorithms typically demand prohibitive computational cost. This paper proposes an efficient meta-learning approach for robust deep learning that addresses both challenges. Specifically, without relying on a prior unbiased validation set, our method dynamically estimates unbiased samples in the training data and leverages meta-learning to refine the deep networks. Furthermore, to significantly reduce the optimization cost of the update iterations, we carefully design the inner-loop adaptation and outer-loop optimization of the meta-learning paradigm, respectively. Experimental results demonstrate that our approach reduces training time by about 6x while achieving comparable or even better generalization performance. In particular, we improve accuracy on the CIFAR-100 benchmark at 40% instance-dependent noise by more than 13% in absolute accuracy.
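The abstract's core idea, estimating a pseudo-clean set from the training data itself and meta-learning sample weights through an inner/outer loop, can be illustrated on a toy problem. The sketch below is not the paper's implementation; it is a minimal L2RW-style reweighting scheme on 1-D regression with flipped-slope label noise, where the "unbiased" meta set is simply re-estimated every step as the current small-loss samples (an assumption made here for illustration):

```python
# Illustrative sketch (NOT the authors' method): meta-learned sample
# reweighting without a prior unbiased set, on toy 1-D regression.
# The pseudo-clean meta set is re-estimated each step as the small-loss
# half of the training data; an L2RW-style one-step inner adaptation then
# assigns weight to samples whose gradient aligns with that set.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x, but 40% of labels have their slope flipped (label noise).
n = 400
x = rng.normal(size=n)
y = 2.0 * x
noisy = rng.random(n) < 0.4
y[noisy] = -2.0 * x[noisy]

w = 0.0                       # scalar model y_hat = w * x (clean slope is 2)
inner_lr, outer_lr = 0.1, 0.1

for _ in range(300):
    resid = w * x - y
    per_sample_grad = 2.0 * resid * x            # d l_i / d w

    # Dynamically estimate a pseudo-clean set: the half with smallest loss.
    losses = resid ** 2
    clean = losses <= np.quantile(losses, 0.5)

    # Inner loop, taken symbolically at eps = 0:
    #   w_adapt(eps) = w - inner_lr * sum_i eps_i * per_sample_grad_i,
    # so the meta-gradient of the pseudo-clean loss w.r.t. eps_i is
    #   -inner_lr * g_clean * per_sample_grad_i.
    g_clean = per_sample_grad[clean].mean()
    meta_grad = -inner_lr * g_clean * per_sample_grad

    # Outer loop: keep samples whose gradient agrees with the clean set.
    weights = np.maximum(0.0, -meta_grad)
    if weights.sum() > 0:
        weights /= weights.sum()
    w -= outer_lr * np.sum(weights * per_sample_grad)

print(w)  # recovers roughly the clean slope 2.0
```

For comparison, an unweighted least-squares fit on the same data lands far from the clean slope (roughly 2 x (1 - 2 x 0.4) = 0.4 in expectation), which is what the dynamic reweighting corrects. The single inner adaptation step, rather than unrolling many iterations, mirrors the abstract's goal of cutting the cost of the update iterations.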
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: General Machine Learning (i.e., none of the above)
TL;DR: We present an efficient meta-learning approach that eliminates the dependence on additional unbiased data and reduces the optimization complexity of recent meta-learning based methods.
Supplementary Material: zip