FedGP: Buffer-based Gradient Projection for Continual Federated Learning

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: continual federated learning, catastrophic forgetting, gradient projection
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Continual Federated Learning (CFL) is essential for enabling real-world applications where multiple decentralized clients adaptively learn from continuous data streams. A significant challenge in CFL is mitigating catastrophic forgetting, where models lose previously acquired knowledge when learning new information. Existing works on this issue either make unrealistic assumptions about the availability of task boundaries or rely heavily on surrogate samples. To address this gap, we introduce a buffer-based Gradient Projection method (FedGP). FedGP tackles catastrophic forgetting by leveraging local buffer samples and aggregated buffer gradients, thus preserving knowledge across multiple clients. Our method is compatible with various existing continual learning and CFL techniques, enhancing their performance in the CFL context. Our experiments on standard benchmarks consistently show performance improvements across diverse scenarios. For example, in a task-incremental learning setting on CIFAR100, our method improves accuracy by up to 27%. Our code is available at https://anonymous.4open.science/r/FedGP-F8D4.
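To make the mechanism named in the abstract concrete, below is a minimal sketch of buffer-based gradient projection on a single client. It assumes an A-GEM-style projection rule (drop the component of the current-task gradient that conflicts with the gradient computed on buffered past samples); the paper's exact rule, the cross-client aggregation of buffer gradients, and the helper names `flat_grad`, `assign_flat_grad`, and `project_gradient` are all illustrative assumptions, not the authors' API.

```python
import torch
import torch.nn as nn

def flat_grad(model: nn.Module) -> torch.Tensor:
    """Flatten all parameter gradients into a single vector."""
    return torch.cat([p.grad.reshape(-1) for p in model.parameters()])

def assign_flat_grad(model: nn.Module, flat: torch.Tensor) -> None:
    """Write a flat gradient vector back into the parameters' .grad fields."""
    offset = 0
    for p in model.parameters():
        n = p.numel()
        p.grad = flat[offset:offset + n].view_as(p)
        offset += n

def project_gradient(g: torch.Tensor, g_buf: torch.Tensor) -> torch.Tensor:
    """If the current gradient conflicts with the buffer gradient
    (negative inner product), remove the conflicting component:
        g <- g - (g . g_buf / ||g_buf||^2) * g_buf
    so the update does not increase the loss on buffered past samples.
    (A-GEM-style rule; in FedGP the buffer gradient would come from
    aggregation across clients, which this single-client toy omits.)"""
    dot = torch.dot(g, g_buf)
    if dot < 0:
        g = g - (dot / torch.dot(g_buf, g_buf)) * g_buf
    return g

# Toy demonstration on random data.
model = nn.Linear(10, 2)
loss_fn = nn.CrossEntropyLoss()
opt = torch.optim.SGD(model.parameters(), lr=0.1)

x_new, y_new = torch.randn(8, 10), torch.randint(0, 2, (8,))  # current task
x_buf, y_buf = torch.randn(8, 10), torch.randint(0, 2, (8,))  # replay buffer

# Gradient on buffered (past-task) samples.
opt.zero_grad()
loss_fn(model(x_buf), y_buf).backward()
g_buf = flat_grad(model)

# Gradient on the current batch, projected to avoid conflicting with g_buf.
opt.zero_grad()
loss_fn(model(x_new), y_new).backward()
g = project_gradient(flat_grad(model), g_buf)

assign_flat_grad(model, g)
opt.step()
```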
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: pdf
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5562