Can Sequential Bayesian Inference Solve Continual Learning?

Published: 29 Jan 2022, Last Modified: 05 May 2023 · AABI 2022 Poster · Readers: Everyone
Keywords: Continual Learning, Sequential Bayesian Inference
TL;DR: We propagate the posterior of a Bayesian neural network (BNN) with HMC, fitting a density estimator over the samples to perform Bayesian CL. This did not work, so we re-evaluate the use of sequential Bayes in CL and propose an alternative approach, which we name Prototypical Bayesian CL.
Abstract: Previous work in Continual Learning (CL) has used sequential Bayesian inference to prevent forgetting and to accumulate knowledge from previous tasks. A limiting factor in performing Bayesian CL has been the intractability of exact inference in a Bayesian Neural Network (NN). We perform sequential Bayesian inference with a Bayesian NN using Hamiltonian Monte Carlo (HMC), propagating the posterior as a prior for a new task by fitting a density estimator on the HMC samples. We find that this approach fails to prevent forgetting. We therefore propose an alternative view of the CL problem that directly models the data-generating process and decomposes the CL problem into task-specific and shared parameters. This method, named Prototypical Bayesian CL, performs well compared to the latest Bayesian CL methods.
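The abstract describes posterior propagation by recursive Bayes, p(θ | D_1, D_2) ∝ p(D_2 | θ) p(θ | D_1), where the task-1 posterior is approximated by a density estimator fitted on HMC samples. The sketch below is a minimal illustration of that idea, not the authors' implementation: a simple Gaussian stands in for the density estimator, and the "HMC samples" and toy likelihood are placeholder assumptions.

```python
# Minimal sketch of posterior propagation via a density estimator
# (illustrative only; a real run would draw samples with an HMC
# sampler over the BNN weights and could use a richer estimator,
# e.g. a GMM or normalizing flow).
import numpy as np

rng = np.random.default_rng(0)

# Placeholder for HMC samples from the task-1 posterior p(theta | D_1),
# shape (num_samples, num_params).
samples_task1 = rng.normal(loc=1.0, scale=0.5, size=(1000, 3))

# Fit a Gaussian density estimator to the samples.
mu = samples_task1.mean(axis=0)
cov = np.cov(samples_task1, rowvar=False)
cov_inv = np.linalg.inv(cov)

def log_prior(theta):
    """Log-density (up to a constant) of the fitted estimator,
    reused as the prior for task 2."""
    diff = theta - mu
    return -0.5 * diff @ cov_inv @ diff

def log_posterior_task2(theta, log_likelihood):
    """Unnormalized log-posterior for task 2 by recursive Bayes:
    log p(theta | D_1, D_2) = log p(D_2 | theta) + log p(theta | D_1) + const."""
    return log_likelihood(theta) + log_prior(theta)

# Toy Gaussian task-2 likelihood, for illustration only.
toy_loglik = lambda theta: -0.5 * np.sum((theta - 2.0) ** 2)
print(log_posterior_task2(np.ones(3), toy_loglik))
```

The paper's negative result concerns exactly this recursion: even with HMC providing high-quality samples, the density-estimation step loses enough posterior structure that forgetting is not prevented.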