Keywords: peer-to-peer computing, deep learning, federated learning
TL;DR: MAR-FL is a new peer-to-peer federated learning system that uses iterative group-based aggregation to reduce communication costs from O(n²) to O(n log n), making it more scalable and robust for wireless networks with unreliable participants.
Abstract: The convergence of next-generation wireless systems and distributed Machine Learning (ML) demands Federated Learning (FL) methods that remain efficient and robust with wirelessly connected peers and under network churn.
Peer-to-peer (P2P) FL removes the bottleneck of a central coordinator, but existing approaches suffer from excessive communication complexity, limiting their scalability in practice.
We introduce **MAR-FL**, a novel P2P FL system that leverages iterative group-based aggregation to substantially reduce communication overhead while retaining resilience to churn.
MAR-FL achieves communication costs that scale as $\mathcal{O}(N\log{N})$, in contrast to the $\mathcal{O}(N^2)$ complexity of existing baselines, and thereby remains effective as the number of peers in an aggregation round grows.
The system is robust to unreliable FL clients and can integrate private computing.
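To make the complexity claim concrete, the following is a minimal sketch (not the authors' implementation) of how group-based aggregation changes the message count relative to all-to-all exchange. The group size, the star-shaped intra-group exchange, and the number of re-grouping rounds are assumptions chosen only for illustration.

```python
# Illustrative message-count comparison: naive all-to-all P2P aggregation
# versus iterative group-based aggregation. This is a sketch under assumed
# parameters (group_size, star topology per group), not MAR-FL itself.
import math


def all_to_all_messages(num_peers: int) -> int:
    # Every peer sends its model update to every other peer: O(N^2) messages.
    return num_peers * (num_peers - 1)


def group_based_messages(num_peers: int, group_size: int = 4) -> int:
    # Each round, every peer joins a group of `group_size` and exchanges its
    # current model within the group (star pattern: (group_size - 1) uploads
    # plus (group_size - 1) downloads per group). After roughly
    # log_{group_size}(N) rounds of re-grouping, every peer's contribution has
    # mixed into every other peer's model, giving O(N log N) messages in total.
    rounds = max(1, math.ceil(math.log(num_peers, group_size)))
    groups_per_round = math.ceil(num_peers / group_size)
    return rounds * groups_per_round * 2 * (group_size - 1)


if __name__ == "__main__":
    for n in (16, 64, 256, 1024):
        print(f"N={n:5d}  all-to-all={all_to_all_messages(n):8d}  "
              f"group-based={group_based_messages(n):6d}")
```

For example, at N = 1024 the all-to-all scheme requires on the order of a million messages per aggregation round, while the group-based scheme stays in the low thousands, which is the scalability gap the abstract refers to.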
Submission Number: 35