A Nested Bi-level Optimization Framework for Robust Few Shot Learning

Sep 30, 2021 (edited Dec 10, 2021) · NeurIPS 2021 Workshop MetaLearn Poster
  • Keywords: meta-learning, robust few-shot learning, MAML
  • TL;DR: We propose a novel nested bi-level optimization framework that assigns weights to training tasks or instances in the presence of outliers (OOD tasks or noisy samples) during the meta-training phase.
  • Abstract: Model-Agnostic Meta-Learning (MAML), a popular gradient-based meta-learning framework, assumes that each task or instance contributes equally to the meta-learner. Hence, it fails to address the domain shift between base and novel classes in few-shot learning. In this work, we propose a novel robust meta-learning algorithm, NESTEDMAML, which learns to assign weights to training tasks or instances. We treat the weights as hyper-parameters and iteratively optimize them on a small set of validation tasks in a nested bi-level optimization approach (in contrast to the standard bi-level optimization in MAML). We then apply NESTEDMAML in the meta-training stage, which involves (1) several tasks sampled from a distribution different from the meta-test task distribution, or (2) some data samples with noisy labels. Extensive experiments on synthetic and real-world datasets demonstrate that NESTEDMAML efficiently mitigates the effects of "unwanted" tasks or instances, leading to significant improvement over state-of-the-art robust meta-learning methods.
  • Contribution Process Agreement: Yes
  • Poster Session Selection: Poster session #2 (16:50 UTC+1)
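The nested bi-level idea in the abstract (task weights as hyper-parameters in an upper level, tuned against validation tasks, wrapped around MAML's usual inner adaptation) can be sketched on toy 1-D linear-regression tasks. This is a minimal illustrative sketch, not the paper's algorithm: the first-order adaptation, the finite-difference hyper-gradient for the weights, and all names (`inner_lr`, `meta_lr`, `weight_lr`) are assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(slope, n=20):
    """A toy regression task: fit y = slope * x (illustrative stand-in for a few-shot task)."""
    x = rng.uniform(-1, 1, n)
    return x, slope * x

def loss(theta, x, y):
    return np.mean((theta * x - y) ** 2)

def grad(theta, x, y):
    return np.mean(2 * (theta * x - y) * x)

def adapt(theta, task, inner_lr=0.1):
    """MAML-style inner loop: one gradient step on the task (first-order approximation)."""
    x, y = task
    return theta - inner_lr * grad(theta, x, y)

# Meta-training tasks: two "clean" tasks (slope ~1) and one OOD outlier (slope -5).
train_tasks = [make_task(1.0), make_task(1.1), make_task(-5.0)]
val_tasks = [make_task(0.9), make_task(1.05)]  # small validation-task set for the upper level

theta = 0.0
w = np.ones(len(train_tasks)) / len(train_tasks)  # task weights (hyper-parameters)
meta_lr, weight_lr, eps = 0.05, 0.5, 1e-3

for step in range(200):
    # Lower level: weighted meta-update through one adaptation step per task.
    adapted = [adapt(theta, t) for t in train_tasks]
    task_grads = [grad(th, *t) for th, t in zip(adapted, train_tasks)]
    meta_g = sum(wi * gi for wi, gi in zip(w, task_grads))
    theta_new = theta - meta_lr * meta_g

    # Upper level: update each weight against validation loss, using a
    # finite-difference proxy for the nested hyper-gradient.
    def val_loss(th):
        return np.mean([loss(th, *t) for t in val_tasks])
    for i, gi in enumerate(task_grads):
        theta_i = theta - meta_lr * (meta_g + eps * gi)  # effect of nudging w[i]
        w[i] -= weight_lr * (val_loss(theta_i) - val_loss(theta_new)) / eps

    w = np.clip(w, 0.0, None)          # keep weights non-negative
    w = w / w.sum()                    # renormalize
    theta = theta_new

# The outlier task's weight is driven toward zero, so theta fits the clean tasks.
print("weights:", w, "theta:", theta)
```

In this sketch the upper level never needs the outlier task to be labeled as such: its weight shrinks simply because up-weighting it would increase the validation-task loss, which mirrors the abstract's claim that "unwanted" tasks are down-weighted automatically.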
