Multi-Subspace Structured Meta-Learning

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: meta-learning
Abstract: Meta-learning aims to extract meta-knowledge from historical tasks to accelerate learning on new tasks. A critical challenge in meta-learning is handling task heterogeneity, i.e., tasks drawn from different distributions. Unlike typical meta-learning algorithms that learn a single, globally shared initialization, recent structured meta-learning algorithms partition tasks into multiple groups via centroid-based clustering and learn one initialization per group. However, these algorithms still require task models within the same group to be close together, and they fail to exploit negative correlations between tasks. In this paper, we instead formulate task models into a subspace structure and propose a MUlti-Subspace structured Meta-Learning (MUSML) algorithm to learn the subspace bases. We establish convergence guarantees and analyze the generalization performance. Experimental results confirm the effectiveness of the proposed MUSML algorithm.
One-sentence Summary: We propose a new meta-learning algorithm to learn multiple subspaces for task models.
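
The abstract describes the method only at a high level. A minimal sketch of the multi-subspace idea, assuming linear-regression tasks, a least-squares inner solve, and a first-order outer update (all illustrative choices, not the authors' actual MUSML procedure), might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, K = 20, 3, 2            # parameter dim, subspace rank, number of subspaces
lr, n_steps = 0.01, 500       # outer-loop step size and iteration count

# Hypothetical task generator: each task's true regressor lies in one of K subspaces.
true_bases = [np.linalg.qr(rng.standard_normal((d, r)))[0] for _ in range(K)]

def sample_task():
    theta = true_bases[rng.integers(K)] @ rng.standard_normal(r)
    X_s, X_q = rng.standard_normal((10, d)), rng.standard_normal((10, d))
    return (X_s, X_s @ theta), (X_q, X_q @ theta)   # (support, query) sets

# Meta-parameters: K subspace bases, learned across tasks.
bases = [rng.standard_normal((d, r)) for _ in range(K)]

for step in range(n_steps):
    (X_s, y_s), (X_q, y_q) = sample_task()
    # Inner loop: fit per-task coefficients w within each subspace by least squares.
    fits = []
    for B in bases:
        w = np.linalg.lstsq(X_s @ B, y_s, rcond=None)[0]
        fits.append((np.mean((X_q @ B @ w - y_q) ** 2), w))
    # Assign the task to the subspace with the lowest query loss.
    k = int(np.argmin([loss for loss, _ in fits]))
    B, (_, w) = bases[k], fits[k]
    # Outer loop: first-order gradient step on the chosen basis, w held fixed.
    resid = X_q @ B @ w - y_q
    bases[k] = B - lr * (2.0 / len(y_q)) * (X_q.T @ np.outer(resid, w))
```

One appeal of the subspace formulation over centroid-based clustering is visible here: task models with coefficients w and -w lie in the same subspace, so even negatively correlated tasks can share a basis, since only the span matters rather than proximity to a centroid.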
Supplementary Material: zip