Bayesian Meta Sampling for Fast Uncertainty Adaptation

Published: 20 Dec 2019 · Last Modified: 05 May 2023 · ICLR 2020 Conference Blind Submission
TL;DR: We propose a Bayesian meta sampling method for fast adaptation of model uncertainty in meta learning.
Abstract: Meta learning has made impressive progress on fast model adaptation, but limited work has addressed fast uncertainty adaptation for Bayesian modeling. In this paper, we propose to achieve this goal by placing meta learning on the space of probability measures, inducing the concept of meta sampling for fast uncertainty adaptation. Specifically, we propose a Bayesian meta sampling framework with two main components: a meta sampler and a sample adapter. The meta sampler adopts a neural-inverse-autoregressive-flow (NIAF) structure, a variant of the recently proposed neural autoregressive flows, to efficiently generate meta samples to be adapted. The sample adapter then moves these meta samples to task-specific samples using a newly proposed, general Bayesian sampling technique called optimal-transport Bayesian sampling. Combining the two components yields a simple learning procedure for the meta sampler that can be efficiently optimized via standard back-propagation. Extensive experimental results demonstrate the efficiency and effectiveness of the proposed framework, which obtains better sample quality and faster uncertainty adaptation than related methods.
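To make the two-component structure concrete, below is a minimal, hedged sketch of the framework's shape in PyTorch. All names (`MetaSampler`, `SampleAdapter`, `adapt`) are illustrative placeholders rather than the authors' actual API, the meta sampler is a simple residual transform standing in for the NIAF structure, and the adapter uses plain gradient ascent on a task log-density as a crude stand-in for optimal-transport Bayesian sampling.

```python
# Sketch only: names and update rules are assumptions, not the paper's implementation.
import torch
import torch.nn as nn

class MetaSampler(nn.Module):
    """Generates 'meta samples' from base noise via a learned transform
    (a stand-in for the paper's NIAF-based meta sampler)."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )
        self.dim = dim

    def forward(self, n_samples):
        z = torch.randn(n_samples, self.dim)   # base noise
        return z + self.net(z)                 # transformed meta samples

class SampleAdapter:
    """Moves meta samples toward a task-specific posterior by a few gradient
    steps on the task's unnormalized log-density (a crude stand-in for the
    paper's optimal-transport Bayesian sampling update)."""
    def __init__(self, step_size=1e-2, n_steps=5):
        self.step_size, self.n_steps = step_size, n_steps

    def adapt(self, samples, task_log_prob):
        x = samples.clone().detach().requires_grad_(True)
        for _ in range(self.n_steps):
            grad = torch.autograd.grad(task_log_prob(x).sum(), x)[0]
            x = (x + self.step_size * grad).detach().requires_grad_(True)
        return x.detach()

# Usage on a toy task: adapt meta samples toward a Gaussian target N(1, I).
dim = 2
sampler, adapter = MetaSampler(dim), SampleAdapter()
meta_samples = sampler(n_samples=128)
task_log_prob = lambda x: -0.5 * ((x - 1.0) ** 2).sum(dim=1)  # up to a constant
task_samples = adapter.adapt(meta_samples, task_log_prob)
print(task_samples.mean(dim=0))  # drifts toward the task mean (1, 1)
```

In the paper's setting, the meta sampler is shared across tasks and trained end-to-end through the adaptation step via back-propagation; the sketch above only illustrates how meta samples flow from the sampler into a per-task adapter.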
Keywords: Bayesian Sampling, Uncertainty Adaptation, Meta Learning, Variational Inference
Code: [zheshiyige/meta-sampling](https://github.com/zheshiyige/meta-sampling)
Data: [CIFAR-10](https://paperswithcode.com/dataset/cifar-10)