On Enforcing Better Conditioned Meta-Learning for Rapid Few-Shot Adaptation

Published: 31 Oct 2022, Last Modified: 15 Oct 2022, NeurIPS 2022 Accept
Keywords: Few-shot learning, meta-learning, condition number, preconditioning
TL;DR: Inspired by the concept of preconditioning, we propose a novel method to significantly increase adaptation speed for gradient-based meta-learning methods without incurring extra parameters.
Abstract: Inspired by the concept of preconditioning, we propose a novel method to increase adaptation speed for gradient-based meta-learning methods without incurring extra parameters. We demonstrate that recasting the optimisation problem into a non-linear least-squares formulation provides a principled way to actively enforce a well-conditioned parameter space for meta-learning models, based on the concepts of the condition number and local curvature. Our comprehensive evaluations show that the proposed method significantly outperforms its unconstrained counterpart, especially during the initial adaptation steps, while achieving comparable or better overall results on several few-shot classification tasks, creating the possibility of dynamically choosing the number of adaptation steps at inference time.
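The intuition behind the abstract's claim can be seen on a toy problem. The following sketch is purely illustrative and is not the paper's method: it runs plain gradient descent on two quadratic losses with diagonal Hessians, one ill-conditioned and one well-conditioned, to show why the condition number (the ratio of the largest to smallest Hessian eigenvalue) governs how many adaptation steps are needed. All names here (`gd_steps`, `h_bad`, `h_good`) are hypothetical.

```python
def gd_steps(diag_hessian, x0, lr, tol=1e-6, max_steps=100_000):
    """Gradient descent on f(x) = 0.5 * sum_i h_i * x_i^2 (minimiser x = 0).

    Returns the number of steps until every coordinate is below `tol`.
    """
    x = list(x0)
    for step in range(max_steps):
        if max(abs(v) for v in x) < tol:
            return step
        # Gradient of the quadratic in coordinate i is h_i * x_i.
        x = [v - lr * h * v for v, h in zip(x, diag_hessian)]
    return max_steps


# For a diagonal Hessian the condition number is max(h) / min(h).
h_bad = [100.0, 1.0]   # condition number 100
h_good = [1.0, 1.0]    # condition number 1
x0 = [1.0, 1.0]

# Stable step size must satisfy lr < 2 / lambda_max, so the ill-conditioned
# problem is forced to take a small step and crawls along the flat direction.
steps_bad = gd_steps(h_bad, x0, lr=1.0 / 100)
steps_good = gd_steps(h_good, x0, lr=1.0)
print(steps_bad, steps_good)  # the well-conditioned case converges in 1 step
```

Roughly, gradient descent needs a number of steps proportional to the condition number, which is why enforcing a well-conditioned parameter space (as the abstract proposes) can make the first few adaptation steps far more effective.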
Supplementary Material: pdf
