Local Learning Matters: Rethinking Data Heterogeneity in Federated Learning

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: Federated Learning
Abstract: Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices). However, the data distribution across clients is often non-IID, making efficient optimization difficult. To alleviate this issue, many FL algorithms focus on mitigating the effects of data heterogeneity across clients by introducing a variety of proximal terms, some of which incur considerable compute and/or memory overhead, that restrain local updates with respect to the global model. Instead, we rethink solutions to data heterogeneity in FL with a focus on local learning generality rather than proximal restriction. Inspired by findings from the generalization literature, we employ second-order information to better understand algorithm effectiveness in FL and find that, in many cases, standard regularization methods are surprisingly strong performers in mitigating the effects of data heterogeneity. Armed with key insights from our analysis, we propose a simple and effective method, FedAlign, to overcome data heterogeneity and the pitfalls of previous methods. FedAlign achieves accuracy comparable to state-of-the-art FL methods across a variety of settings while minimizing computation and memory overhead.
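To make the class of methods the abstract criticizes concrete, below is a minimal sketch of a FedProx-style local update, in which a proximal term penalizes drift of the client weights from the global model. The function name, the coefficient `mu`, and the training loop are illustrative assumptions for this sketch, not the paper's FedAlign method.

```python
import torch
import torch.nn.functional as F

def local_update_with_proximal_term(model, global_model, loader, mu=0.01, lr=0.01, epochs=1):
    """One client's local training with a FedProx-style proximal term
    that restrains local updates with respect to the global model."""
    # Snapshot of the global weights received at the start of the round.
    global_params = [p.detach().clone() for p in global_model.parameters()]
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)

    for _ in range(epochs):
        for x, y in loader:
            optimizer.zero_grad()
            task_loss = F.cross_entropy(model(x), y)
            # Proximal restriction: (mu / 2) * ||w - w_global||^2
            prox = sum((p - g).pow(2).sum()
                       for p, g in zip(model.parameters(), global_params))
            (task_loss + 0.5 * mu * prox).backward()
            optimizer.step()
    return model.state_dict()
```

Setting `mu = 0` recovers plain FedAvg local training; larger `mu` restrains local updates more strongly at the cost of slower local adaptation, which is the trade-off the abstract contrasts with its generality-focused alternative.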
One-sentence Summary: We study the data heterogeneity challenge of federated learning from the perspective of local learning generality, provide unique insights, and propose an effective method based on our findings.
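The abstract also mentions using second-order information to study local learning generality. A common way to obtain such a curvature signal is the dominant Hessian eigenvalue of a client's loss, estimated by power iteration over Hessian-vector products. The sketch below is an assumed PyTorch illustration of that standard estimator, not the paper's exact analysis code.

```python
import torch

def top_hessian_eigenvalue(model, loss_fn, x, y, iters=20):
    """Estimate the dominant Hessian eigenvalue of the loss via power
    iteration on Hessian-vector products (a common curvature proxy)."""
    params = [p for p in model.parameters() if p.requires_grad]
    loss = loss_fn(model(x), y)
    # Keep the graph so we can differentiate the gradient again (double backprop).
    grads = torch.autograd.grad(loss, params, create_graph=True)

    v = [torch.randn_like(p) for p in params]
    eigenvalue = 0.0
    for _ in range(iters):
        # Normalize the current eigenvector estimate.
        norm = torch.sqrt(sum((u * u).sum() for u in v))
        v = [u / norm for u in v]
        # Hessian-vector product: H v = d(g . v) / dw.
        gv = sum((g * u).sum() for g, u in zip(grads, v))
        hv = torch.autograd.grad(gv, params, retain_graph=True)
        # Rayleigh quotient v^T H v with unit-norm v.
        eigenvalue = sum((h * u).sum() for h, u in zip(hv, v)).item()
        v = [h.detach() for h in hv]
    return eigenvalue
```

Lower dominant eigenvalues are typically read as flatter local minima, which is the kind of generality signal the abstract says it uses to compare FL algorithms.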