Keywords: Federated Learning; Federated Continuous Learning; Personalized Federated Learning
Abstract: Recently, Federated Continuous Learning (FCL) has gained attention for modeling real-world dynamic problems, with catastrophic forgetting as its core challenge. While generative replay is widely used in FCL methods to mitigate this issue, high cross-client data heterogeneity forces an excessive number of FL rounds per task to reach convergence, which conflicts with clients' demand for immediate responses. To address this, we focus on real-time FCL, where incremental data arrives in small batches and each batch is accessible only during its own FL round, so that global data heterogeneity varies across rounds. We propose pFedGRP, which consists of two key components. First, a flexible generative replay architecture that decouples the generator by category to mitigate inter-class catastrophic forgetting, couples it with the task model to reduce redundant updates and improve generation quality, and adaptively adjusts client-specific local generation scales. Second, a personalized FCL framework via generative replay that optimizes aggregation weights on the server side for real-time model personalization and transfers personalized knowledge to an additional average global model on the client side to mitigate catastrophic forgetting. Experiments show that pFedGRP outperforms other generative-replay-based FCL methods, achieving both superior performance and lower regret.
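To make the round structure concrete, the following is a minimal, self-contained sketch (not the authors' implementation) of how one pFedGRP-style real-time FCL round could look. It assumes toy linear task models, per-class Gaussian generators standing in for the category-decoupled generators, and similarity-based personalized aggregation weights on the server; all names (Client, ClassGenerator, Server, local_update, aggregate) are hypothetical illustrations of the described components.

```python
"""Toy sketch of one real-time FCL round with per-category generative replay
and server-side personalized aggregation (illustrative assumptions only)."""
import numpy as np

rng = np.random.default_rng(0)
DIM, N_CLIENTS = 16, 3


class ClassGenerator:
    """Per-category generator: here simply a Gaussian fit to that class's features."""
    def __init__(self, feats):
        self.mu, self.std = feats.mean(axis=0), feats.std(axis=0) + 1e-3

    def sample(self, n):
        return rng.normal(self.mu, self.std, size=(n, DIM))


class Client:
    def __init__(self):
        self.model = np.zeros(DIM)        # local task model (toy linear weights)
        self.generators = {}              # one lightweight generator per seen class
        self.avg_global = np.zeros(DIM)   # extra average global model against forgetting

    def local_update(self, batch_x, batch_y, replay_per_class=8):
        # Replay synthetic samples only for classes already seen (client-specific scale).
        replay = [g.sample(replay_per_class) for g in self.generators.values()]
        data = np.vstack([batch_x] + replay) if replay else batch_x
        # Toy "training": nudge the model toward the mean of current + replayed features.
        self.model += 0.1 * (data.mean(axis=0) - self.model)
        # Refresh only the generators of classes present in this round's small batch.
        for c in np.unique(batch_y):
            self.generators[c] = ClassGenerator(batch_x[batch_y == c])
        return self.model.copy()


class Server:
    def aggregate(self, updates):
        updates = np.stack(updates)
        # Personalized aggregation: weight peers by similarity to each client's update.
        sims = updates @ updates.T
        weights = np.exp(sims) / np.exp(sims).sum(axis=1, keepdims=True)
        personalized = weights @ updates       # one personalized model per client
        avg_global = updates.mean(axis=0)      # plain average model for forgetting mitigation
        return personalized, avg_global


clients, server = [Client() for _ in range(N_CLIENTS)], Server()
for rnd in range(2):  # each FL round sees a new small batch, accessible only in that round
    updates = []
    for cl in clients:
        x = rng.normal(size=(10, DIM))
        y = rng.integers(0, 4, size=10)
        updates.append(cl.local_update(x, y))
    personalized, avg_global = server.aggregate(updates)
    for cl, pm in zip(clients, personalized):
        cl.model, cl.avg_global = pm, avg_global
    print(f"round {rnd}: personalized model norms",
          np.round(np.linalg.norm(personalized, axis=1), 3))
```

The sketch only illustrates the control flow: per-round small-batch arrival, category-wise replay on the client, and server-side computation of both personalized models and an average global model.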
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 1872