Uncertainty in Multitask Transfer Learning

27 Sept 2018 (modified: 05 May 2023) · ICLR 2019 Conference Blind Submission
Abstract: Using variational Bayes neural networks, we develop an algorithm that accumulates knowledge from multiple different tasks into a prior. This results in a rich prior capable of few-shot learning on new tasks. The posterior goes beyond the mean-field approximation and yields good uncertainty estimates on the performed experiments. Analysis on toy tasks shows that the method can learn from significantly different tasks while finding similarities among them. Experiments on Mini-Imagenet reach state of the art with 74.5% accuracy on 5-shot learning. Finally, we provide two new benchmarks, each exposing a failure mode of existing meta-learning algorithms such as MAML and Prototypical Networks.
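The abstract describes training per-task variational posteriors regularized toward a shared, learned prior. The paper's exact objective and posterior family are not given in this excerpt; as a minimal illustrative sketch (assuming diagonal-Gaussian posteriors and prior, which is the mean-field case the paper says it goes beyond), the multitask evidence lower bound sums, over tasks, an expected log-likelihood term minus a KL term against the shared prior:

```python
import numpy as np

def kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p):
    """KL( N(mu_q, var_q) || N(mu_p, var_p) ) for diagonal Gaussians,
    summed over parameter dimensions."""
    var_q, var_p = np.exp(logvar_q), np.exp(logvar_p)
    return 0.5 * np.sum(
        logvar_p - logvar_q + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0
    )

def multitask_elbo(log_liks, task_posteriors, prior):
    """Sum of per-task ELBOs: E_q[log p(D_t | w)] - KL(q_t || p_shared).

    log_liks[t]        -- Monte Carlo estimate of the expected log-likelihood
                          of task t's data under its posterior (hypothetical
                          inputs for this sketch).
    task_posteriors[t] -- (mu, logvar) of task t's variational posterior.
    prior              -- (mu, logvar) of the shared, learned prior.
    """
    mu_p, logvar_p = prior
    elbo = 0.0
    for ll, (mu_q, logvar_q) in zip(log_liks, task_posteriors):
        elbo += ll - kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p)
    return elbo
```

Maximizing this jointly over the task posteriors and the shared prior is what lets the prior accumulate knowledge across tasks; a new few-shot task then only needs to fit its own small posterior against the already-informative prior.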
Keywords: Multi Task, Transfer Learning, Hierarchical Bayes, Variational Bayes, Meta Learning, Few Shot learning
TL;DR: A scalable method for learning an expressive prior over neural networks across multiple tasks.
Data: [mini-Imagenet](https://paperswithcode.com/dataset/mini-imagenet)