GAML: geometry-aware meta-learning via a fully adaptive preconditioner

22 Sept 2022 (modified: 13 Feb 2023) · ICLR 2023 Conference Withdrawn Submission
Keywords: Meta-learning, few-shot learning
Abstract: Model-Agnostic Meta-Learning (MAML) is one of the most successful meta-learning algorithms. It has a bi-level optimization structure: the outer loop learns a shared initialization, and the inner loop adapts it into task-specific weights. While MAML uses standard gradient descent in the inner loop, recent works have shown that it can be beneficial to precondition the inner-loop gradient with a meta-learned matrix. Existing preconditioners, however, cannot be simultaneously task-specific and path-dependent, and most of them ignore the geometry of the loss surface. In this work, we propose Geometry-Aware Meta-Learning (GAML) to overcome these limitations. GAML efficiently meta-learns a preconditioner that depends on the task-specific parameters, and this preconditioner can be shown to be a Riemannian metric that defines the geometry of the loss surface. GAML therefore performs fully adaptive, geometry-aware optimization in the inner loop. Experimental results show that GAML outperforms state-of-the-art members of the MAML and PGD-MAML families on a variety of few-shot learning tasks.
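For concreteness, below is a minimal sketch of the preconditioned inner-loop update that the abstract describes, written in PyTorch over a flattened parameter vector. This is an illustration of the general technique, not the paper's implementation: the names preconditioned_inner_step, precond_fn, and inner_lr are assumed, and the actual method meta-learns the preconditioner in the outer loop.

```python
import torch

def preconditioned_inner_step(theta, loss_fn, precond_fn, inner_lr=0.01):
    """One inner-loop step: theta' = theta - lr * P(theta) @ grad(L(theta)).

    theta      -- flat task parameters (1-D tensor with requires_grad=True)
    loss_fn    -- task loss as a function of the flat parameters
    precond_fn -- meta-learned map theta -> PSD matrix P(theta); making P
                  depend on the current theta is what gives the update the
                  task-specific, path-dependent character discussed above
    """
    loss = loss_fn(theta)
    # create_graph=True keeps the step differentiable for the outer loop
    (grad,) = torch.autograd.grad(loss, theta, create_graph=True)
    P = precond_fn(theta)                 # (d, d) preconditioning matrix
    return theta - inner_lr * (P @ grad)  # P = I recovers plain MAML
```

Having precond_fn return the identity recovers MAML's plain gradient step; the PGD-MAML family meta-learns a preconditioner in the outer loop, and the abstract's claim is that making it a function of the task-specific parameters yields a Riemannian metric on the loss surface.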
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Deep Learning and representational learning