FedEWA: Federated Learning with Elastic Weighted Averaging

Published: 2022 (IJCNN 2022), Last Modified: 15 May 2023
Abstract: Federated Learning (FL) is a distributed machine learning paradigm in which a global model is learned collaboratively across edge devices without violating data privacy. However, intrinsic data heterogeneity in the federated network can induce model heterogeneity, posing a major challenge to server-side model aggregation. Existing FL algorithms widely adopt model-wise weighted averaging of client models to generate the new global model, which emphasizes the importance of each model as a whole but ignores distinctions among the internal parameters of different client models. In this paper, we propose a novel parameter-wise elastic weighted averaging aggregation approach to realize the rapid fusion of heterogeneous client models. Specifically, each client evaluates the importance of the model's internal parameters in its model update and obtains a corresponding parameter importance coefficient vector; the server then performs parameter-wise weighted averaging over each parameter based on these importance coefficient vectors, thereby aggregating a new global model. Extensive experiments on the MNIST and CIFAR-10 datasets with diverse network architectures and hyper-parameter combinations show that our proposed algorithm outperforms existing state-of-the-art FL algorithms in heterogeneous model fusion.
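The parameter-wise aggregation described in the abstract can be sketched as follows. Note that the abstract does not specify how clients compute importance coefficients, so the use of the per-parameter update magnitude below is purely an illustrative assumption, as are the function names; the sketch only shows the general shape of server-side parameter-wise weighted averaging.

```python
import numpy as np

def importance_vector(old_params, new_params, eps=1e-8):
    """Hypothetical importance score per parameter (an assumption, not the
    paper's measure): the absolute magnitude of each parameter's update,
    with a small epsilon to avoid all-zero weights."""
    return np.abs(new_params - old_params) + eps

def elastic_weighted_average(client_params, client_importance):
    """Parameter-wise weighted averaging: each scalar parameter of the new
    global model is a weighted average across clients, with weights given
    by that parameter's importance coefficient on each client."""
    params = np.stack(client_params)       # shape (num_clients, num_params)
    imp = np.stack(client_importance)      # shape (num_clients, num_params)
    weights = imp / imp.sum(axis=0, keepdims=True)  # normalize per parameter
    return (weights * params).sum(axis=0)  # shape (num_params,)
```

Unlike model-wise averaging (e.g. FedAvg, which scales each client model by a single scalar), every parameter here gets its own per-client weight, so a client that updated a given parameter strongly contributes more to that parameter alone.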