Efficient Federated Learning with Smooth Aggregation for Non-IID Data from Multiple Edges

Published: 01 Jan 2024 · Last Modified: 28 Jul 2025 · ICASSP 2024 · CC BY-SA 4.0
Abstract: Federated learning (FL) learns a global model by aggregating local models trained on distributed data from different devices. Because data distributions are heterogeneous across devices, the local models diverge, which degrades the global model's performance. Recent studies attempt to balance local models so that the resulting global model adapts to every device, but they overlook a more challenging problem: redundant local models from devices break this balance and cause the global model to overfit to them. We therefore propose FedSmooth, a novel global aggregation algorithm. FedSmooth first identifies redundant local models without requiring sensitive local information (e.g., label distributions), and then applies a smooth global aggregation that strengthens the effect of the local models that accelerate convergence to the optimal global model. Experimental results show that our method outperforms four state-of-the-art baselines, even under higher redundancy.
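The abstract only describes the method at a high level, so the following is a minimal, hypothetical sketch of one way a redundancy-aware "smooth" aggregation could look: redundant local updates are detected from pairwise similarity of the updates themselves (no label distributions needed) and down-weighted before averaging. The function name aggregate_smooth, the cosine-similarity heuristic, and the temperature parameter are illustrative assumptions, not the paper's actual algorithm.

```python
# Hypothetical sketch of redundancy-aware weighted aggregation (not FedSmooth itself).
# Assumes each client update is a flat numpy vector (delta from the global model).
import numpy as np

def aggregate_smooth(global_params, client_updates, temperature=1.0):
    """Aggregate client updates, down-weighting near-duplicate (redundant) ones.

    Redundancy is estimated from pairwise cosine similarity of the updates
    themselves, so no sensitive local information (e.g., label distributions)
    is required. This is an illustrative heuristic only.
    """
    U = np.stack(client_updates)                        # (num_clients, dim)
    norms = np.linalg.norm(U, axis=1, keepdims=True) + 1e-12
    cos = (U / norms) @ (U / norms).T                   # pairwise cosine similarity
    # A client whose update is similar to many others is likely redundant:
    redundancy = cos.mean(axis=1)                       # higher = more redundant
    # Softmax over negative redundancy: distinctive updates get larger weights.
    logits = -redundancy / temperature
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()
    aggregated = (weights[:, None] * U).sum(axis=0)
    return global_params + aggregated, weights

# Toy usage: three near-duplicate clients and one distinctive client.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    g = np.zeros(10)
    base = rng.normal(size=10)
    updates = [base + 0.01 * rng.normal(size=10) for _ in range(3)]
    updates.append(rng.normal(size=10))                 # the distinctive client
    new_global, w = aggregate_smooth(g, updates)
    print("aggregation weights:", np.round(w, 3))       # distinctive client weighted highest
```

In this sketch the "smoothness" comes from the softmax temperature: larger values push the weights back toward a plain average, while smaller values more aggressively suppress redundant updates.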