Sign-MAML: Efficient Model-Agnostic Meta-Learning by SignSGD

Published: 10 Dec 2021, Last Modified: 12 Mar 2024
NeurIPS 2021 Workshop MetaLearn Poster
Keywords: Meta Learning, Bilevel Optimization, MAML
TL;DR: We interpret Model-Agnostic Meta-Learning (MAML) as a bilevel optimization (BLO) problem and leverage sign-based SGD (signSGD) as the lower-level optimizer of the BLO, yielding a new computationally efficient first-order algorithm for MAML.
Abstract: We propose a new computationally efficient first-order algorithm for Model-Agnostic Meta-Learning (MAML). The key enabling technique is to interpret MAML as a bilevel optimization (BLO) problem and to leverage sign-based SGD (signSGD) as the lower-level optimizer of the BLO. We show that MAML, through the lens of signSGD-oriented BLO, naturally yields an alternating optimization scheme that requires only first-order gradients of the learned meta-model. We term the resulting algorithm Sign-MAML. Compared to the conventional first-order MAML (FO-MAML) algorithm, Sign-MAML is theoretically grounded, as it does not impose any assumption on the absence of second-order derivatives during meta-training. In practice, we show that Sign-MAML outperforms FO-MAML on various few-shot image classification tasks and, compared to MAML, achieves a much more graceful tradeoff between classification accuracy and computational efficiency.
Poster Session Selection: Poster session #2 (16:50 UTC+1)
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2109.07497/code)
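For intuition, below is a minimal PyTorch sketch of one Sign-MAML meta-training step as we read the abstract; the function name, hyperparameters, and toy task are illustrative assumptions, not the authors' code. The sign operation has zero derivative almost everywhere, so detaching it in the inner update discards no second-order term: the meta-gradient is first-order by construction rather than by assumption, which is the contrast with FO-MAML drawn above.

```python
import torch
import torch.nn as nn

def sign_maml_step(model, support, query, inner_lr=0.01, inner_steps=5):
    """One Sign-MAML meta-step: signSGD inner loop, first-order outer gradient."""
    loss_fn = nn.CrossEntropyLoss()
    # Fast weights start as differentiable copies of the meta-parameters.
    fast = {n: p.clone() for n, p in model.named_parameters()}

    x_s, y_s = support
    for _ in range(inner_steps):
        loss = loss_fn(torch.func.functional_call(model, fast, (x_s,)), y_s)
        grads = torch.autograd.grad(loss, list(fast.values()))
        # signSGD inner update: only the gradient's sign is used. Its
        # derivative w.r.t. the meta-parameters is zero almost everywhere,
        # so detaching it drops no second-order information.
        fast = {n: w - inner_lr * g.sign().detach()
                for (n, w), g in zip(fast.items(), grads)}

    # Outer (meta) loss on the query set; backprop reaches the original
    # meta-parameters only through the identity part of the inner updates,
    # i.e. the meta-gradient is purely first-order.
    x_q, y_q = query
    meta_loss = loss_fn(torch.func.functional_call(model, fast, (x_q,)), y_q)
    meta_loss.backward()
    return meta_loss.item()

# Toy usage (illustrative shapes): 5-way classification with a linear model.
model = nn.Linear(10, 5)
support = (torch.randn(25, 10), torch.randint(0, 5, (25,)))
query = (torch.randn(25, 10), torch.randint(0, 5, (25,)))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
opt.zero_grad()
sign_maml_step(model, support, query)
opt.step()  # outer update of the meta-parameters
```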