Keywords: Unsupervised meta-learning, energy-based model, likelihood-based learning
TL;DR: We introduce meta-SVEBM for unsupervised meta-learning and achieve competitive results on miniImageNet and Omniglot.
Abstract: Meta-learning aims to learn a model from a stream of tasks such that the model generalizes across tasks and rapidly adapts to new ones. We propose to learn an energy-based model (EBM) in the latent space of a top-down generative model, so that the EBM in the low-dimensional latent space can be learned efficiently and adapted to each task rapidly. Furthermore, the energy term couples a continuous latent vector with a symbolic one-hot label. This coupling allows the model to be learned in an unsupervised manner when labels are unknown. Our model is trained without labels during the meta-training phase and evaluated in a semi-supervised setting during the meta-test phase. We evaluate our model on two widely used benchmarks for few-shot meta-learning, Omniglot and miniImageNet, and it achieves competitive or superior performance compared to previous state-of-the-art meta-learning models.
Contribution Process Agreement: Yes
Poster Session Selection: Poster session #1 (12:00 UTC), Poster session #2 (15:00 UTC)