Keywords: continual learning, Bayesian learning, imprecise probability
TL;DR: We propose a new continual learning algorithm that efficiently generates models for any user-preferred stability-plasticity trade-off point.
Abstract: Algorithms that balance the stability-plasticity trade-off are well-studied in the continual learning literature. However, only a few of them focus on obtaining models for specified trade-off preferences. When solving the problem of continual learning under specific trade-offs (CLuST), state-of-the-art techniques leverage rehearsal-based learning, which requires retraining whenever a model corresponding to a new trade-off preference is requested. This is inefficient, since there exist infinitely many different trade-offs and a large number of models may be requested. In response, we propose Imprecise Bayesian Continual Learning (IBCL), an algorithm that tackles CLuST efficiently. IBCL replaces retraining with constant-time convex combination. Given a new task, IBCL (1) updates the knowledge base, maintained as a convex hull of model parameter distributions, and (2) generates one Pareto-optimal model per given trade-off via convex combination, without any additional training. That is, obtaining models corresponding to specified trade-offs via IBCL is zero-shot. Experiments against current CLuST baselines show that IBCL improves average per-task accuracy by up to 45\% and peak per-task accuracy by up to 43\%, while maintaining near-zero to positive backward transfer. Moreover, its training overhead, measured by the number of batch updates, remains constant at every task, regardless of the number of preferences requested. Details at: \url{https://github.com/ibcl-anon/ibcl}.
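The zero-shot generation step in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes, for illustration only, that each vertex of the knowledge-base convex hull is summarized by a mean parameter vector, and that a trade-off preference is a weight vector on those vertices; all names below are hypothetical.

```python
import numpy as np

def zero_shot_model(vertex_params, preference):
    """Hypothetical sketch: convex-combine vertex model parameters.

    vertex_params: list of 1-D parameter vectors, one per vertex of the
        knowledge base's convex hull (illustrative simplification).
    preference: non-negative trade-off weights over the vertices,
        summing to 1 (a point in the probability simplex).
    Returns the convex combination -- no training involved, so the
    cost is constant in the number of previously requested preferences.
    """
    w = np.asarray(preference, dtype=float)
    if np.any(w < 0) or not np.isclose(w.sum(), 1.0):
        raise ValueError("preference must be a convex-combination weight vector")
    P = np.stack([np.asarray(p, dtype=float) for p in vertex_params])
    return w @ P  # weighted sum of vertex parameter vectors

# Example: two vertices, a 70/30 stability-plasticity preference
params = zero_shot_model([[0.0, 2.0], [1.0, 0.0]], [0.7, 0.3])
```

Because each requested preference reduces to one weighted sum, serving many preference points adds no batch updates, matching the constant training overhead claimed in the abstract.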
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5851