CFL: Cluster Federated Learning in Large-Scale Peer-to-Peer Networks

12 May 2023 · OpenReview Archive Direct Upload
Abstract: High bandwidth costs, data privacy concerns, and the risk of a single point of failure motivate the development of federated learning (FL) in large-scale peer-to-peer (P2P) networks. In this paper, we propose the first fine-grained global model training protocol, dubbed CFL, which is efficient and privacy-preserving. Rigorous analyses show that CFL guarantees the privacy, integrity, and authenticity of local model update parameters under two widespread threat models. Extensive experiments on the Trec06p and Trec07 datasets show that the global model trained by CFL achieves good classification accuracy, a rapid convergence rate, and robustness to client dropout. Compared to the first global model training protocol for FL in P2P networks, CFL improves both communication and computational efficiency.
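The abstract does not spell out the aggregation step, but global model training in FL protocols such as CFL builds on combining clients' local model updates into one global update. Below is a minimal, hedged sketch of standard weighted federated averaging (FedAvg-style); the function name, data layout, and weighting scheme are illustrative assumptions, not details taken from the paper.

```python
# Illustrative FedAvg-style aggregation of local model updates.
# This is a generic sketch of the building block that P2P/cluster FL
# protocols organize at scale, NOT the CFL protocol itself.

def federated_average(local_updates, weights=None):
    """Weighted average of clients' parameter vectors.

    local_updates: list of equal-length lists of floats, one per client.
    weights: optional per-client weights (e.g. local dataset sizes);
             defaults to uniform weighting.
    """
    n = len(local_updates)
    if weights is None:
        weights = [1.0] * n
    total = sum(weights)
    dim = len(local_updates[0])
    global_update = [0.0] * dim
    for w, update in zip(weights, local_updates):
        for i, v in enumerate(update):
            global_update[i] += (w / total) * v
    return global_update

# Three clients with 2-parameter local models, weighted by data size.
clients = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sizes = [10, 10, 20]
print(federated_average(clients, sizes))  # [3.5, 4.5]
```

In a privacy-preserving protocol like CFL, clients would not send these updates in the clear; the paper's threat-model analysis concerns protecting exactly these parameters during aggregation.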