Variational Federated Continual Learning

20 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Federated Continual Learning, Bayesian Neural Network, Variational Inference
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Federated continual learning (FCL) is an emerging learning paradigm with the potential to improve the scalability of federated learning by enabling continual learning across multiple learners. However, FCL faces two significant challenges: local overfitting and catastrophic forgetting. To address both simultaneously, we propose Variational Federated Continual Learning (VFCL), a novel FCL framework based on Bayesian neural networks that comprises two core components. First, we propose variational inference with a mixture prior that merges global and local historical knowledge, simultaneously addressing local overfitting caused by the absence of global knowledge and catastrophic forgetting caused by the absence of historical knowledge. Second, to minimize the error in acquiring global knowledge, we present an effective global posterior aggregation method. We also provide a theoretical analysis of the upper bound on the generalization error of VFCL, which further helps in selecting optimal hyperparameters. Empirical evaluations show that VFCL outperforms other state-of-the-art methods on the widely used CIFAR-100 and TinyImageNet datasets.
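The abstract's two components can be illustrated with a minimal PyTorch sketch. The code below is a hypothetical illustration, not the authors' implementation: the names `MixturePriorBayesianLinear`, `pi`, and `aggregate_gaussian_posteriors` are assumed, and the single-sample Monte Carlo KL estimate and precision-weighted posterior product are common choices for these subproblems; the paper's actual formulation may differ.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.distributions as dist

class MixturePriorBayesianLinear(nn.Module):
    """Mean-field Bayesian linear layer with a two-component Gaussian
    mixture prior: a global component (from server aggregation) and a
    local historical component (the client's previous posterior)."""

    def __init__(self, in_features, out_features, pi=0.5):
        super().__init__()
        self.pi = pi  # assumed mixing weight between the two prior components
        # Variational posterior q(w) = N(mu, softplus(rho)^2), learned locally.
        self.mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.rho = nn.Parameter(torch.full((out_features, in_features), -3.0))
        # Prior parameters are buffers, refreshed each communication round / task.
        self.register_buffer("global_mu", torch.zeros(out_features, in_features))
        self.register_buffer("global_sigma", torch.ones(out_features, in_features))
        self.register_buffer("local_mu", torch.zeros(out_features, in_features))
        self.register_buffer("local_sigma", torch.ones(out_features, in_features))

    def forward(self, x):
        sigma = F.softplus(self.rho)
        # Reparameterization trick: sample w = mu + sigma * eps.
        w = self.mu + sigma * torch.randn_like(sigma)
        return x @ w.t(), self._kl_to_mixture(w, sigma)

    def _kl_to_mixture(self, w, sigma):
        # Single-sample Monte Carlo estimate of KL(q || mixture prior);
        # the KL to a Gaussian mixture has no closed form.
        log_q = dist.Normal(self.mu, sigma).log_prob(w).sum()
        log_p_g = dist.Normal(self.global_mu, self.global_sigma).log_prob(w)
        log_p_l = dist.Normal(self.local_mu, self.local_sigma).log_prob(w)
        log_prior = torch.logsumexp(
            torch.stack([log_p_g + math.log(self.pi),
                         log_p_l + math.log(1.0 - self.pi)]), dim=0).sum()
        return log_q - log_prior

def aggregate_gaussian_posteriors(mus, sigmas):
    """Server-side sketch: precision-weighted product of client Gaussian
    posteriors, one common way to form a global posterior."""
    precisions = torch.stack([s.pow(-2) for s in sigmas])
    total_precision = precisions.sum(dim=0)
    global_mu = (precisions * torch.stack(mus)).sum(dim=0) / total_precision
    global_sigma = total_precision.rsqrt()
    return global_mu, global_sigma
```

Under these assumptions, a client would minimize the task negative log-likelihood plus the returned KL term (typically scaled by 1/N for dataset size N), and the server would call `aggregate_gaussian_posteriors` on the uploaded posterior parameters to refresh each client's global prior buffers.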
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2499