Variance-Reduced Meta-Learning via Laplace Approximation

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: meta-learning, multi-task learning, few-shot learning
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We propose a new method to reduce the variance of gradient-based meta-learning by weighting the support points according to their uncertainty.
Abstract: Meta-learning algorithms aim to learn a general prior over a set of related tasks to facilitate generalization to new, unseen tasks. This is achieved by estimating the optimal posterior using a finite set of support data. However, this estimation is subject to high variance due to the limited amount of support data for each task, which often leads to sub-optimal generalization performance. In this paper, we address the problem of variance reduction in gradient-based meta-learning and define a class of problems that is particularly prone to it. Specifically, we propose a novel approach that reduces the variance of the gradient estimate by weighting each support point individually by the variance of its posterior over the parameters. To estimate the posterior, we utilize the Laplace approximation, which allows us to express the variance in terms of the curvature of the loss landscape of our meta-learner. Experimental results demonstrate the effectiveness of the proposed method and highlight the importance of variance reduction in meta-learning.
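The following is a minimal, hedged sketch of the idea described in the abstract: each support point's gradient contribution is re-weighted by an uncertainty estimate obtained from a diagonal Laplace (curvature) approximation. It is not the authors' implementation; the helper name `laplace_weighted_meta_grad`, the diagonal squared-gradient proxy for the Hessian, and the `prior_precision` parameter are all illustrative assumptions.

```python
# Illustrative sketch only: per-support-point gradient weighting via a
# diagonal Laplace-style curvature proxy. Not the paper's exact formula.
import torch


def laplace_weighted_meta_grad(model, loss_fn, support_x, support_y,
                               prior_precision=1.0):
    """Return a meta-gradient in which each support point is weighted by an
    inverse-variance estimate from a diagonal curvature approximation."""
    params = [p for p in model.parameters() if p.requires_grad]

    per_point_grads, weights = [], []
    for x, y in zip(support_x, support_y):
        loss = loss_fn(model(x.unsqueeze(0)), y.unsqueeze(0))
        grads = torch.autograd.grad(loss, params)
        flat_g = torch.cat([g.reshape(-1) for g in grads])

        # Diagonal curvature proxy (squared gradient + prior precision),
        # in the spirit of a Laplace / Gauss-Newton approximation.
        # Posterior variance ~ 1 / curvature, so the inverse-variance
        # weight is the (averaged) curvature itself.
        curvature = flat_g.pow(2) + prior_precision
        weights.append(curvature.mean())
        per_point_grads.append(flat_g)

    weights = torch.stack(weights)
    weights = weights / weights.sum()  # normalize to a convex combination
    return (weights.unsqueeze(1) * torch.stack(per_point_grads)).sum(dim=0)


# Toy usage on a linear model with a random few-shot support set.
model = torch.nn.Linear(4, 1)
support_x, support_y = torch.randn(5, 4), torch.randn(5, 1)
g = laplace_weighted_meta_grad(model, torch.nn.functional.mse_loss,
                               support_x, support_y)
print(g.shape)  # flattened parameter dimension
```

The key design choice this sketch tries to convey is that support points whose induced posterior is more uncertain (flatter curvature) contribute less to the meta-gradient, which is how the variance of the gradient estimate is reduced.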
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5760