GCFed: Exploiting Gradient Correlation for Client Selection and Rate Allocation in Federated Learning
Abstract: Federated Learning (FL) has gained increasing popularity for its ability to harness diverse datasets from multiple sources without the need for data centralization. Extensive research has focused on reducing the communication cost between remote clients and the parameter server. However, existing works fail to comprehensively leverage the correlation among the gradients at the remote clients. In this work, we propose GCFed -- a novel FL framework that exploits the clients' gradient correlation to reduce communication costs while maintaining satisfactory convergence. Specifically, we propose an information-theoretic formulation that casts the model update in a single FL iteration as a multi-terminal source coding problem in the context of rate-distortion theory. We solve the associated optimization problem with an iterative algorithm based on convex semidefinite relaxation, and leverage the solution to develop a joint approach for correlation-aware client selection and rate allocation. Extensive experiments validate the effectiveness of our framework against state-of-the-art methods. Our code is available at: https://anonymous.4open.science/r/GCFed-D03B
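To make the notion of correlation-aware client selection concrete, here is a minimal toy sketch (not the paper's actual algorithm, which relies on rate-distortion theory and semidefinite relaxation): clients whose gradients are highly correlated carry redundant information, so a greedy heuristic can select clients that are least correlated with those already chosen. All function names and the greedy criterion below are illustrative assumptions.

```python
import math

def cosine(u, v):
    # Cosine similarity between two gradient vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def select_clients(gradients, k):
    """Greedily pick k client indices, each minimizing the maximum
    absolute correlation with the already-selected set.
    (Hypothetical heuristic, for illustration only.)"""
    selected = [0]  # start from client 0 (arbitrary choice)
    while len(selected) < k:
        best, best_score = None, float("inf")
        for i in range(len(gradients)):
            if i in selected:
                continue
            score = max(abs(cosine(gradients[i], gradients[j]))
                        for j in selected)
            if score < best_score:
                best, best_score = i, score
        selected.append(best)
    return selected

grads = [
    [1.0, 0.0],   # client 0
    [0.99, 0.1],  # client 1: nearly identical gradient to client 0
    [0.0, 1.0],   # client 2: orthogonal to client 0
]
print(select_clients(grads, 2))  # -> [0, 2]: the least-correlated pair
```

With a budget of two clients, the heuristic skips client 1 (redundant with client 0) and picks the orthogonal client 2, capturing more information for the same communication cost.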
Submission Length: Regular submission (no more than 12 pages of main content)
Previous TMLR Submission Url: https://openreview.net/forum?id=EBiUIKT3Yy&noteId=EBiUIKT3Yy
Changes Since Last Submission: Removed LaTeX commands from the title and abstract
Assigned Action Editor: ~Gang_Niu1
Submission Number: 4936