Connection-Adaptive Meta-Learning

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Withdrawn Submission · Readers: Everyone
Keywords: Meta-learning, NAS, Fast adaptation
Abstract: Meta-learning enables models to adapt to new environments rapidly with a few training examples. Current gradient-based meta-learning methods concentrate on finding good initializations (meta-weights) for learners but ignore the impact of neural architectures. In this paper, we aim to obtain better meta-learners by co-optimizing the architecture and the meta-weights simultaneously. Existing NAS-based methods apply a two-stage strategy, i.e., first searching for an architecture and then re-training meta-weights for the searched architecture. However, this two-stage strategy leads to a suboptimal meta-learner, since the meta-weights are overlooked while the architecture is being searched. In contrast, we propose a more efficient and effective method for meta-learning, namely Connection-Adaptive Meta-learning (CAML), which jointly searches architectures and trains the meta-weights on consolidated connections. During the search, we consolidate the architecture connections layer by layer, fixing first the layer with the largest architecture weight value. With only a single search, our CAML obtains both an adaptive architecture and meta-weights for meta-learning. Extensive experiments show that CAML achieves state-of-the-art performance with 130x less computational cost, demonstrating our method's effectiveness and efficiency.
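The abstract's layer-wise consolidation rule can be made concrete with a short sketch. The PyTorch code below is a hypothetical illustration, not the authors' released implementation: the names `MixedLayer`, `make_candidates`, and `consolidate_one_layer` are introduced here for exposition, and the candidate operations, the DARTS-style softmax mixing, and the exact selection rule (fix the not-yet-fixed layer whose largest softmax-normalized architecture weight is highest) are assumptions consistent with the abstract's description.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_candidates(dim):
    # Hypothetical candidate operations per layer; the paper's actual search
    # space is not specified in the abstract.
    return nn.ModuleList([
        nn.Linear(dim, dim),                            # plain linear
        nn.Sequential(nn.Linear(dim, dim), nn.ReLU()),  # linear + ReLU
        nn.Identity(),                                  # skip connection
    ])

class MixedLayer(nn.Module):
    """A layer mixing candidate ops with softmax-weighted architecture params."""
    def __init__(self, dim):
        super().__init__()
        self.ops = make_candidates(dim)
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))  # architecture weights
        self.fixed = None  # index of the consolidated op, once fixed

    def forward(self, x):
        if self.fixed is not None:
            # Consolidated: only the fixed connection is used from now on.
            return self.ops[self.fixed](x)
        w = F.softmax(self.alpha, dim=0)
        return sum(wi * op(x) for wi, op in zip(w, self.ops))

def consolidate_one_layer(layers):
    """Fix the not-yet-fixed layer whose largest architecture weight is highest."""
    best, score = None, -1.0
    for layer in layers:
        if layer.fixed is None:
            top = F.softmax(layer.alpha, dim=0).max().item()
            if top > score:
                best, score = layer, top
    if best is not None:
        best.fixed = int(best.alpha.argmax())
    return best
```

Under these assumptions, calling `consolidate_one_layer` once per layer between meta-training steps would discretize the whole network over a single search run, so the final architecture and its meta-weights emerge together, matching the "searching only once" claim.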
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
One-sentence Summary: We propose Connection-Adaptive Meta-learning, which jointly searches architectures and trains the meta-weights on consolidated connections.
Supplementary Material: zip
Reviewed Version (pdf): https://openreview.net/references/pdf?id=2MKFVS1gy