Don’t Memorize; Mimic The Past: Federated Class Incremental Learning Without Episodic Memory

Published: 19 Jun 2023, Last Modified: 21 Jul 2023, FL-ICML 2023
Keywords: federated learning, class incremental learning, generative models, data-free, continual learning
TL;DR: This work presents a federated class incremental learning framework that uses a generative model to synthesize past data instead of storing it in an episodic memory.
Abstract: Deep learning models are prone to forgetting information learned in the past when trained on new data. This problem becomes even more pronounced in the context of Federated Learning (FL), where data is decentralized and subject to independent changes for each user. Continual Learning (CL) studies this so-called **catastrophic forgetting** phenomenon primarily in centralized settings, where the learner has direct access to the complete training dataset. However, applying CL techniques to FL is not straightforward due to privacy concerns and resource limitations. This paper presents a framework for federated class incremental learning that uses a generative model to synthesize samples from past distributions instead of storing a portion of past data. Clients can then leverage the generative model to mitigate catastrophic forgetting locally. To preserve privacy, the generative model is trained on the server using data-free methods at the end of each task, without requesting data from clients. We demonstrate significant improvements on the CIFAR-100 dataset compared to existing baselines.
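To make the high-level idea concrete, below is a minimal PyTorch sketch of the client-side step: instead of replaying stored samples from an episodic memory, the client mixes its current-task batches with synthetic past-class samples drawn from a frozen, server-provided generator. The class-conditional generator, the module shapes, and the names `CondGenerator` / `client_update` are illustrative assumptions for this sketch, not the paper's actual architecture, loss terms, or data-free generator training procedure.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset

# Illustrative sizes only (not taken from the paper).
LATENT_DIM = 64
IMG_SHAPE = (3, 32, 32)
IMG_DIM = 3 * 32 * 32
PAST_CLASSES, TOTAL_CLASSES = 10, 20  # classes from earlier tasks vs. all seen so far


class CondGenerator(nn.Module):
    """Stand-in for the server-trained generator of past-task samples (assumed class-conditional here)."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(PAST_CLASSES, LATENT_DIM)
        self.net = nn.Sequential(
            nn.Linear(2 * LATENT_DIM, 256), nn.ReLU(),
            nn.Linear(256, IMG_DIM), nn.Tanh(),
        )

    def forward(self, z, y):
        h = torch.cat([z, self.embed(y)], dim=1)
        return self.net(h).view(-1, *IMG_SHAPE)


def client_update(model, generator, loader, epochs=1, lr=1e-2, replay_batch=32, replay_weight=1.0):
    """Local training that mixes real current-task batches with synthetic
    past-class batches sampled from the frozen generator (no memory buffer)."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    generator.eval()
    for _ in range(epochs):
        for x, y in loader:
            # Standard loss on the client's real data from the current task.
            loss = F.cross_entropy(model(x), y)

            # Synthesize past-class samples and labels on the fly; no stored past data is needed.
            with torch.no_grad():
                z = torch.randn(replay_batch, LATENT_DIM)
                y_old = torch.randint(0, PAST_CLASSES, (replay_batch,))
                x_old = generator(z, y_old)
            loss = loss + replay_weight * F.cross_entropy(model(x_old), y_old)

            opt.zero_grad()
            loss.backward()
            opt.step()
    return model.state_dict()


# Toy usage with random "current task" data covering only the new classes.
model = nn.Sequential(nn.Flatten(), nn.Linear(IMG_DIM, TOTAL_CLASSES))
generator = CondGenerator()
x = torch.randn(128, *IMG_SHAPE)
y = torch.randint(PAST_CLASSES, TOTAL_CLASSES, (128,))
loader = DataLoader(TensorDataset(x, y), batch_size=32, shuffle=True)
client_update(model, generator, loader)
```

In a full round, the returned client state dicts would be aggregated on the server (e.g., by federated averaging), after which the server would retrain its generator with data-free methods before the next task, as described in the abstract.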
Submission Number: 74