On the Subspace Structure of Gradient-Based Meta-Learning

26 May 2022 (modified: 23 Jul 2022), ICML 2022 Pre-training Workshop
Keywords: meta-learning, gradient-based meta-learning, parameter space, few-shot classification, dimensionality reduction
TL;DR: An investigation into the structure of the learnt task-adapted parameters in gradient-based meta-learning.
Abstract: In this work we provide an analysis of the distribution of the post-adaptation parameters of Gradient-Based Meta-Learning (GBML) methods. Previous work has observed that, for image classification, this adaptation only takes place in the last layers of the network. We propose the more general notion that parameters are updated over a low-dimensional subspace of the same dimensionality as the task space, and show that this holds for regression as well. Furthermore, the induced subspace structure provides a method to estimate the intrinsic dimension of the space of tasks of common few-shot learning datasets.
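The abstract's central claim, that per-task parameter updates concentrate on a low-dimensional subspace whose dimensionality can then be read off as an estimate of the task space's intrinsic dimension, can be illustrated with a minimal sketch. The code below is not the paper's implementation: it fabricates adapted-parameter updates that lie (up to noise) on a known low-dimensional subspace, then recovers that dimension via PCA on the update vectors. All names, sizes, and the 99% explained-variance threshold are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch (not the paper's code): if the per-task updates
# theta_task - theta_meta lie on a low-dimensional subspace, PCA over the
# stacked update vectors should recover that subspace's dimensionality.
rng = np.random.default_rng(0)
n_tasks, n_params, true_dim = 200, 50, 3  # illustrative sizes

# Simulated updates: random coefficients over a fixed 3-dim basis + small noise.
basis = rng.standard_normal((true_dim, n_params))
coeffs = rng.standard_normal((n_tasks, true_dim))
updates = coeffs @ basis + 1e-3 * rng.standard_normal((n_tasks, n_params))

# PCA via SVD of the centered update matrix.
centered = updates - updates.mean(axis=0)
sing_vals = np.linalg.svd(centered, compute_uv=False)
var_ratio = sing_vals**2 / np.sum(sing_vals**2)

# Estimate the intrinsic dimension as the number of principal components
# needed to explain 99% of the variance (an assumed threshold).
est_dim = int(np.searchsorted(np.cumsum(var_ratio), 0.99) + 1)
print(est_dim)  # recovers the planted dimension, 3
```

In a real GBML setting one would collect the actual post-adaptation parameter vectors across tasks (e.g. after MAML's inner loop) in place of the simulated `updates` matrix; the choice of variance threshold, or a more principled intrinsic-dimension estimator, then determines the reported dimensionality.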