Keywords: continual learning, federated learning, catastrophic forgetting, gradient projection, realistic streaming automotive dataset
Abstract: Continual Federated Learning (CFL) has garnered significant attention in recent years due to its potential in real-world scenarios where multiple clients (e.g., mobile phones, autonomous vehicles) continuously observe data in a dynamic environment. However, CFL suffers from catastrophic forgetting, where the model discards previously learned knowledge in favor of new data. To address this issue, we propose a novel method, Buffer-based Gradient Projection (FedGP), which mitigates catastrophic forgetting by replaying local buffer samples and using aggregated buffer gradients to preserve previously learned knowledge across clients. Our method can be combined with a variety of existing continual learning methods, boosting their performance in the CFL setup. We evaluate our approach on both standard benchmark datasets and a realistic streaming decentralized automotive dataset generated using CARLA and OpenFL.
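The abstract describes projecting the current-task gradient using a reference gradient computed on replayed buffer samples. Below is a minimal illustrative sketch of that idea in the style of A-GEM-like gradient projection, assuming flattened parameter gradients; the names (`project_gradient`, `flat_grad`, `g_buf`) are hypothetical, FedGP's exact projection rule is not specified in the abstract, and the federated aggregation of buffer gradients across clients is omitted.

```python
import torch
import torch.nn as nn

def project_gradient(g_new: torch.Tensor, g_buf: torch.Tensor) -> torch.Tensor:
    """If the current-task gradient conflicts with the buffer (reference)
    gradient, remove the conflicting component so the update is less likely
    to increase loss on previously seen data (A-GEM-style projection;
    an assumption, not necessarily FedGP's exact rule)."""
    dot = torch.dot(g_new, g_buf)
    if dot < 0:  # negative dot product means the updates conflict
        g_new = g_new - (dot / torch.dot(g_buf, g_buf)) * g_buf
    return g_new

# --- toy usage on a tiny model ---
model = nn.Linear(4, 2)
loss_fn = nn.CrossEntropyLoss()

def flat_grad(loss: torch.Tensor) -> torch.Tensor:
    # Flatten per-parameter gradients into a single vector.
    grads = torch.autograd.grad(loss, model.parameters())
    return torch.cat([g.reshape(-1) for g in grads])

x_new, y_new = torch.randn(8, 4), torch.randint(0, 2, (8,))
x_buf, y_buf = torch.randn(8, 4), torch.randint(0, 2, (8,))  # replayed buffer samples

g_new = flat_grad(loss_fn(model(x_new), y_new))
g_buf = flat_grad(loss_fn(model(x_buf), y_buf))  # in CFL, aggregated across clients
g = project_gradient(g_new, g_buf)  # projected update direction
```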