Curriculum Meta-Learning for Few-shot Classification

Published: 10 Dec 2021, Last Modified: 12 Mar 2024
NeurIPS 2021 Workshop MetaLearn Poster
Readers: Everyone
Abstract: We propose an adaptation of the curriculum training framework applicable to state-of-the-art meta-learning techniques for few-shot classification. Curriculum-based training commonly attempts to mimic human learning by progressively increasing training complexity to enable incremental concept learning. Since the meta-learner's goal is to learn how to learn from as few samples as possible, the exact number of those samples (i.e., the size of the support set) arises as a natural proxy for a given task's difficulty. We define a simple yet novel curriculum schedule that begins with a larger support size and progressively reduces it throughout training until it matches the desired shot size of the test setup. The proposed method improves both learning efficiency and generalization capability. Our experiments with the MAML algorithm on two few-shot image classification tasks show significant gains from the curriculum training framework. Ablation studies corroborate that our proposed method is independent of the model architecture as well as the meta-learning hyperparameters.
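The core idea is easy to express in code. Below is a minimal, hypothetical sketch of a support-size curriculum of this kind: the abstract only states that the support size starts larger and is progressively reduced to the test-time shot count, so the linear annealing schedule, the function and parameter names, and the episode budget here are illustrative assumptions, not the paper's exact method.

```python
# Hypothetical sketch of a support-size curriculum for episodic meta-training.
# Assumption: the support size is annealed linearly over training; the paper
# only specifies that it starts larger and decays to the target shot count.

def support_size(episode: int, total_episodes: int,
                 start_shots: int, target_shots: int) -> int:
    """Return the number of support examples per class for this episode."""
    frac = min(episode / max(total_episodes - 1, 1), 1.0)  # training progress in [0, 1]
    size = start_shots + frac * (target_shots - start_shots)
    return max(target_shots, round(size))

# Example: anneal from 10-shot down to 1-shot over 10,000 training episodes,
# sampling each episode's task at the scheduled shot size.
for episode in range(10_000):
    k = support_size(episode, 10_000, start_shots=10, target_shots=1)
    # support_set, query_set = sample_task(n_way=5, k_shot=k)  # task sampler is assumed
    # meta_update(model, support_set, query_set)               # e.g., a MAML inner/outer step
```

By the end of training the sampled episodes match the evaluation setup (e.g., 5-way 1-shot), so no train/test mismatch remains at convergence.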
Contribution Process Agreement: Yes
Author Revision Details: We have incorporated the reviewers' recommendations by 1) improving the notation, 2) adding a validation-loss plot alongside the training-loss plot, and 3) justifying the curriculum design, including the choice of learning-rate annealing strategy.
Poster Session Selection: Poster session #1 (12:00 UTC), Poster session #2 (15:00 UTC)
Community Implementations: [2 code implementations](https://www.catalyzex.com/paper/arxiv:2112.02913/code)