Robust Meta-learning with Sampling Noise and Label Noise via Eigen-Reptile

29 Sept 2021 (modified: 22 Oct 2023) · ICLR 2022 Conference Withdrawn Submission · Readers: Everyone
Keywords: meta-learning, few-shot, noisy label
Abstract: Recent years have seen a surge of interest in meta-learning techniques for tackling the few-shot learning (FSL) problem. However, the meta-learner is prone to overfitting, since only a few samples are available and they carry sampling noise even on a clean dataset. More importantly, when handling data with noisy labels, the meta-learner can be extremely sensitive to label noise on a corrupted dataset. To address these two challenges, we present Eigen-Reptile (ER), which updates the meta-parameters along the main direction of the historical task-specific parameters to alleviate both sampling noise and label noise. Specifically, the main direction is computed efficiently despite the large scale of the matrix involved. Furthermore, to obtain a more accurate main direction for Eigen-Reptile in the presence of label noise, we propose Introspective Self-paced Learning (ISPL). We demonstrate the soundness and effectiveness of the proposed Eigen-Reptile and ISPL both theoretically and experimentally. In particular, our experiments on different tasks show that the proposed method outperforms, or is highly competitive with, other gradient-based methods with or without noisy labels.
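To make the main-direction update concrete, here is a minimal NumPy sketch of one plausible reading of the abstract: the inner loop records the trajectory of task-specific parameters, the top principal direction of that trajectory is extracted via the small Gram matrix (an assumption about how the "fast" large-scale computation works; the m×m Gram matrix shares its nonzero eigenvalues with the huge d×d covariance matrix), and the meta-parameters take a Reptile-style step along that direction. All names here (`eigen_reptile_update`, `inner_steps`, `meta_lr`) are hypothetical and not taken from the paper.

```python
import numpy as np

def eigen_reptile_update(theta, task_loss_grad, inner_lr=0.01,
                         inner_steps=8, meta_lr=1.0):
    """One hypothetical Eigen-Reptile outer step (illustrative sketch only).

    theta          : flat meta-parameter vector, shape (d,)
    task_loss_grad : function w -> gradient of the task loss at w, shape (d,)
    """
    # Inner loop: plain SGD on the sampled task, recording the trajectory.
    w = theta.copy()
    trajectory = [w.copy()]
    for _ in range(inner_steps):
        w = w - inner_lr * task_loss_grad(w)
        trajectory.append(w.copy())
    W = np.stack(trajectory)              # (m, d) with m = inner_steps + 1

    # Main direction of the trajectory: its top principal component.
    # Eigendecompose the small (m, m) Gram matrix instead of the huge
    # (d, d) covariance matrix; both have the same nonzero eigenvalues.
    Wc = W - W.mean(axis=0)               # center the parameter snapshots
    gram = Wc @ Wc.T                      # (m, m), cheap since m << d
    eigvals, eigvecs = np.linalg.eigh(gram)
    u = eigvecs[:, -1]                    # coefficients of the top component
    v = Wc.T @ u
    v /= np.linalg.norm(v) + 1e-12        # unit main direction in R^d

    # Orient the direction to agree with the overall displacement,
    # then take a Reptile-style step along it.
    displacement = w - theta
    if v @ displacement < 0:
        v = -v
    return theta + meta_lr * np.linalg.norm(displacement) * v
```

Under this reading, the intuition is that a consistent learning signal dominates the top component of the trajectory, while per-step gradient noise from scarce or mislabeled samples is largely orthogonal to it and is therefore filtered out of the meta-update.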
One-sentence Summary: We theoretically and experimentally demonstrate the soundness and effectiveness of the proposed methods for few-shot learning with or without noisy labels.
Supplementary Material: zip
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2206.01944/code)