Keywords: Federated Learning, Intermittently connected links, Topology-induced variance, Client collaboration, Semi-decentralized, Distributed Mean Estimation
TL;DR: We study federated learning over unreliable networks whose links fail intermittently and at random. Our semi-decentralized strategy, Collaborative Relaying, mitigates these failures through edge collaboration among clients.
Abstract: Intermittent connectivity of clients to the parameter server (PS) is a major bottleneck in federated edge learning. It induces a large generalization gap, especially when the local data distributions among clients are heterogeneous. To overcome communication blockages between clients and the central PS, we introduce the concept of collaborative relaying (ColRel), wherein participating clients relay their neighbors' local updates to the PS in order to boost the participation of clients with poor connectivity. In every communication round, each client first computes a local consensus over a subset of its neighboring clients' updates and then transmits to the PS a weighted average of its own update and its neighbors' updates. In this work, we view ColRel as a variance reduction technique that improves the convergence rate across different optimization setups. Consequently, ColRel can be readily integrated as a black box with existing federated learning systems. We provide analytical upper bounds on the resulting convergence rate, which we tighten by optimizing the relaying weights subject to an unbiasedness condition on the global update. Numerical evaluations on the CIFAR-10 dataset demonstrate that ColRel achieves higher test accuracy than Federated Averaging-based benchmarks for learning over intermittently connected networks.
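To make the relaying mechanism in the abstract concrete, here is a minimal toy sketch (our own illustration, not the paper's implementation) of one ColRel aggregation round with scalar updates. We assume all-to-all client connectivity and model only the client-to-PS uplink as a Bernoulli link with known success probability `p[i]`; the weight choice `alpha = 1 / (n^2 * p[i])` is one hypothetical assignment that satisfies an unbiasedness condition of the form `sum_i p[i] * alpha_ij = 1/n` for every client `j`.

```python
import random

def colrel_round(updates, p, rng):
    """One toy ColRel round: each client relays a local consensus of all
    updates to the PS over an unreliable uplink; weights are scaled by the
    inverse link probability so the PS estimate is unbiased in expectation."""
    n = len(updates)
    s = sum(updates)  # local consensus over all neighbors (full connectivity assumed)
    estimate = 0.0
    for i in range(n):
        if rng.random() < p[i]:  # uplink from client i to the PS succeeds this round
            # alpha_ij = 1 / (n^2 * p[i]) satisfies sum_i p[i] * alpha_ij = 1/n
            estimate += s / (n * n * p[i])
    return estimate

rng = random.Random(0)
updates = [1.0, 2.0, 3.0, 4.0, 5.0]  # toy local "model updates"; true mean is 3.0
p = [0.5] * 5                        # every uplink fails half the time
avg = sum(colrel_round(updates, p, rng) for _ in range(20000)) / 20000
```

Averaged over many simulated rounds, `avg` concentrates near the true mean 3.0 even though each link fails half the time, illustrating the unbiasedness that the paper's weight optimization enforces.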
Is Student: Yes