Keywords: meta-learning, few-shot learning
TL;DR: Bayesian meta-learning using PAC-Bayes framework and implicit prior distributions
Abstract: We introduce a new, rigorously formulated PAC-Bayes few-shot meta-learning algorithm that implicitly learns a model prior distribution of interest. Our proposed method extends the PAC-Bayes framework from the single-task setting to the few-shot meta-learning setting to upper-bound generalisation errors on unseen tasks. We also propose a generative approach that models the shared prior and task-specific posterior more expressively than the usual diagonal Gaussian assumption. We show that models trained with our proposed meta-learning algorithm are well calibrated and accurate, achieving state-of-the-art calibration and classification results on the mini-ImageNet benchmark, and competitive results on a multi-modal task-distribution regression problem.
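To make the abstract's core idea concrete, here is a minimal, hedged sketch of a McAllester-style PAC-Bayes bound under the diagonal Gaussian prior/posterior assumption that the paper argues is too restrictive. This is an illustrative single-task computation, not the paper's meta-learning algorithm: the function names (`kl_diag_gaussians`, `pac_bayes_bound`) and the toy numbers are my own, and the paper's contribution is precisely to replace the closed-form diagonal Gaussian KL below with a more expressive implicit (generative) prior and to lift the bound to the meta-learning setting.

```python
import numpy as np

def kl_diag_gaussians(mu_q, sig_q, mu_p, sig_p):
    """Closed-form KL(q || p) between diagonal Gaussians.

    This closed form is what the diagonal Gaussian assumption buys;
    an implicit (generative) prior has no such expression and must
    be handled differently, as the paper proposes.
    """
    return 0.5 * np.sum(
        (sig_q**2 + (mu_q - mu_p) ** 2) / sig_p**2
        - 1.0
        + 2.0 * np.log(sig_p / sig_q)
    )

def pac_bayes_bound(emp_risk, kl, n, delta=0.05):
    """McAllester-style PAC-Bayes generalisation bound.

    With probability at least 1 - delta over n samples, the expected
    risk of the posterior is bounded by the empirical risk plus a
    complexity term driven by KL(posterior || prior).
    """
    return emp_risk + np.sqrt((kl + np.log(2 * np.sqrt(n) / delta)) / (2 * n))

# Toy example: a posterior close to the prior pays a small complexity penalty.
mu_p, sig_p = np.zeros(3), np.ones(3)          # prior N(0, I)
mu_q, sig_q = 0.1 * np.ones(3), 0.9 * np.ones(3)  # posterior near the prior

kl = kl_diag_gaussians(mu_q, sig_q, mu_p, sig_p)
bound = pac_bayes_bound(emp_risk=0.2, kl=kl, n=100)
print(f"KL = {kl:.4f}, bound = {bound:.4f}")
```

The meta-learning extension described in the abstract would, roughly, share the prior across tasks and learn it so that the resulting bound on unseen-task error is tight; the details are in the paper.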
Code: https://www.dropbox.com/s/fa0msqrr74psaej/SImBa_code.zip?dl=0