FedGAC: Optimizing Generalization in Personalized Federated Learning via Adaptive Initialization and Strategic Client Selection

Published: 01 Jan 2025, Last Modified: 05 Jun 2025 · Clust. Comput. 2025 · CC BY-SA 4.0
Abstract: Federated learning (FL) faces significant challenges in heterogeneous client environments, where statistical heterogeneity degrades the global model's generalization. To address these issues, we introduce FedGAC, an efficient adaptive personalized FL (pFL) method built around critical learning periods (CLP). Unlike existing methods that treat all training stages as equally critical and require participation from all clients, FedGAC strategically selects clients during the CLP to reduce computational and communication burdens. To improve the global model's generalization across statistically diverse clients, we integrate an Adaptive Initialization of Local Models (AILM) module with a personalized aggregation strategy. Training parameters are dynamically adjusted according to dataset quality, ensuring both accuracy and efficiency. Additionally, FedGAC employs parameter compression to further reduce communication and computational costs. We validate the effectiveness of FedGAC through extensive experiments on four benchmark computer-vision datasets. The results demonstrate that FedGAC significantly improves both accuracy and communication efficiency over state-of-the-art FL methods, making it a promising approach for addressing heterogeneity in pFL. The code is available at https://github.com/buaaYYC/FedGAC.git.
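The abstract does not specify FedGAC's selection rule, but the core idea of CLP-aware client sampling can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the function name `select_clients`, the choice of CLP rounds, and the sampling fractions are hypothetical, not taken from the paper.

```python
import random

def select_clients(round_idx, clients, clp_rounds,
                   clp_fraction=0.5, post_fraction=0.2, seed=None):
    """Hypothetical sketch of CLP-aware client selection.

    During the critical learning period (CLP) a larger subset of clients
    participates; outside it, fewer are sampled, so no round requires
    all clients -- cutting communication and computation. The actual
    FedGAC selection criterion is not described in the abstract.
    """
    rng = random.Random(seed)
    fraction = clp_fraction if round_idx in clp_rounds else post_fraction
    k = max(1, int(len(clients) * fraction))  # at least one client per round
    return rng.sample(clients, k)

clients = list(range(20))
clp = set(range(5))  # assume rounds 0-4 form the CLP (illustrative only)

early = select_clients(2, clients, clp, seed=0)   # 50% of clients in the CLP
late = select_clients(10, clients, clp, seed=0)   # 20% of clients afterwards
```

In a full pFL pipeline, the sampled subset would train locally and the server would apply the personalized aggregation and parameter compression the abstract mentions; this sketch covers only the selection step.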