Meta-Learning with Latent Embedding Optimization

Published: 21 Dec 2018, Last Modified: 21 Apr 2024 (ICLR 2019 Conference Blind Submission)
Abstract: Gradient-based meta-learning techniques are both widely applicable and proficient at solving challenging few-shot learning and fast adaptation problems. However, they have practical difficulties when operating on high-dimensional parameter spaces in extreme low-data regimes. We show that it is possible to bypass these limitations by learning a data-dependent latent generative representation of model parameters, and performing gradient-based meta-learning in this low-dimensional latent space. The resulting approach, latent embedding optimization (LEO), decouples the gradient-based adaptation procedure from the underlying high-dimensional space of model parameters. Our evaluation shows that LEO can achieve state-of-the-art performance on the competitive miniImageNet and tieredImageNet few-shot classification tasks. Further analysis indicates LEO is able to capture uncertainty in the data, and can perform adaptation more effectively by optimizing in latent space.
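
The abstract's central mechanism — encode a task into a low-dimensional latent code, decode that code into model parameters, and run the inner-loop gradient adaptation in latent space rather than in parameter space — can be sketched in a few lines. Below is a minimal illustration in PyTorch, assuming pre-computed feature embeddings; the linear encoder/decoder, the dimensions, and the `adapt_and_score` helper are hypothetical simplifications for clarity, not the paper's actual architecture (the abstract specifies a *generative* latent representation of parameters, which plain linear maps stand in for here).

```python
# Minimal LEO-style sketch: adapt a latent code z in the inner loop,
# decode it into classifier weights, and evaluate on the query set.
# All shapes and modules here are illustrative assumptions.
import torch
import torch.nn.functional as F

embed_dim, latent_dim, n_way, k_shot = 64, 16, 5, 1

# Stand-ins for the paper's encoder and generative decoder.
encoder = torch.nn.Linear(embed_dim, latent_dim)
decoder = torch.nn.Linear(latent_dim, embed_dim)   # latent code -> class weights

def adapt_and_score(support_x, support_y, query_x, query_y,
                    inner_lr=1.0, steps=5):
    # Encode each class's support embeddings into one latent code.
    z = torch.stack([encoder(support_x[support_y == c].mean(0))
                     for c in range(n_way)])        # (n_way, latent_dim)
    for _ in range(steps):
        w = decoder(z)                              # (n_way, embed_dim)
        loss = F.cross_entropy(support_x @ w.t(), support_y)
        # Key step: the gradient is taken w.r.t. the low-dimensional z,
        # not the high-dimensional decoded weights w.
        (grad,) = torch.autograd.grad(loss, z, create_graph=True)
        z = z - inner_lr * grad
    w = decoder(z)
    return F.cross_entropy(query_x @ w.t(), query_y)

# One 5-way 1-shot episode on random embeddings.
support_x = torch.randn(n_way * k_shot, embed_dim)
support_y = torch.arange(n_way).repeat_interleave(k_shot)
query_x = torch.randn(n_way * 3, embed_dim)
query_y = torch.arange(n_way).repeat_interleave(3)

meta_loss = adapt_and_score(support_x, support_y, query_x, query_y)
meta_loss.backward()  # outer loop: update encoder/decoder parameters
```

Because adaptation happens over the small latent code instead of the full parameter vector, the inner loop touches far fewer degrees of freedom, which is the decoupling from the "high-dimensional space of model parameters" that the abstract describes.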
Keywords: meta-learning, few-shot, miniImageNet, tieredImageNet, hypernetworks, generative, latent embedding, optimization
TL;DR: Latent Embedding Optimization (LEO) is a novel gradient-based meta-learner with state-of-the-art performance on the challenging 5-way 1-shot and 5-shot miniImageNet and tieredImageNet classification tasks.
Code: [deepmind/leo](https://github.com/deepmind/leo) + [4 community implementations](https://paperswithcode.com/paper/?openreview=BJgklhAcK7)
Data: [ImageNet](https://paperswithcode.com/dataset/imagenet), [mini-Imagenet](https://paperswithcode.com/dataset/mini-imagenet), [tieredImageNet](https://paperswithcode.com/dataset/tieredimagenet)
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:1807.05960/code)