Federated Sparse Training: Lottery Aware Model Compression for Resource Constrained Edge

Published: 21 Oct 2022, Last Modified: 05 May 2023 · FL-NeurIPS 2022 Poster
Keywords: Sparse federated learning (FL), communication efficient FL, computation efficient FL
TL;DR: We present methodologies for sparse federated learning on resource-constrained edge devices, under both homogeneous and heterogeneous compute budgets.
Abstract: The limited computation and communication capabilities of clients pose significant challenges in federated learning (FL) over resource-limited edge nodes. A potential solution to this problem is to deploy off-the-shelf sparse learning algorithms that train a binary sparse mask on each client, with the expectation of arriving at a consistent sparse server mask. However, as we investigate in this paper, such naive deployments result in a significant accuracy drop compared to FL with dense models, especially under low client resource budgets. In particular, our investigations reveal a serious lack of consensus among the masks trained on clients, which prevents convergence of the server mask and potentially leads to a substantial drop in model performance. Based on these key observations, we propose \textit{federated lottery aware sparsity hunting} (FLASH), a unified sparse learning framework that lets the server win a lottery in the form of a sparse sub-model, greatly improving performance under highly resource-limited client settings. Moreover, to address device heterogeneity, we leverage our findings to propose \textit{hetero-FLASH}, in which clients can have different target sparsity budgets based on their device resource limits. Extensive experimental evaluations with multiple models on various datasets (both IID and non-IID) show the superiority of our approach, yielding up to $\mathord{\sim}10.1\%$ improved accuracy with $\mathord{\sim}10.26\times$ lower communication cost than existing alternatives at similar hyperparameter settings.
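To make the mask-consensus issue described in the abstract concrete, below is a minimal sketch (not from the paper) of the naive sparse-FL setup it critiques: each client holds a binary mask, the server aggregates them, and a pairwise overlap score quantifies consensus. The majority-vote aggregation rule, the Jaccard consensus metric, and all function names are illustrative assumptions, not the FLASH algorithm itself.

```python
import numpy as np

def random_binary_mask(num_params: int, density: float, rng) -> np.ndarray:
    """Sample a binary mask retaining a `density` fraction of parameters."""
    mask = np.zeros(num_params, dtype=np.int8)
    keep = rng.choice(num_params, size=int(density * num_params), replace=False)
    mask[keep] = 1
    return mask

def aggregate_masks(client_masks: list[np.ndarray], density: float) -> np.ndarray:
    """Illustrative majority-vote aggregation: keep the positions retained
    by the most clients, matching the target density."""
    votes = np.sum(client_masks, axis=0)
    k = int(density * votes.size)
    server_mask = np.zeros_like(client_masks[0])
    server_mask[np.argsort(-votes)[:k]] = 1  # top-k most-voted positions
    return server_mask

def pairwise_consensus(client_masks: list[np.ndarray]) -> float:
    """Mean Jaccard overlap between client masks; low values signal the
    lack of consensus the abstract identifies as the failure mode."""
    n, scores = len(client_masks), []
    for i in range(n):
        for j in range(i + 1, n):
            a, b = client_masks[i], client_masks[j]
            scores.append(np.sum(a & b) / np.sum(a | b))
    return float(np.mean(scores))

rng = np.random.default_rng(0)
masks = [random_binary_mask(10_000, density=0.1, rng=rng) for _ in range(8)]
print(f"consensus (Jaccard): {pairwise_consensus(masks):.3f}")  # ~0.05 if masks are independent
server = aggregate_masks(masks, density=0.1)
print(f"server mask density: {server.mean():.3f}")
```

If client masks are trained independently (as with off-the-shelf sparse learning), their overlap stays near the chance level of roughly $p^2/(2p - p^2) \approx 0.05$ at 10% density, so the voted server mask matches no client well; raising this consensus is the gap the paper's framework targets.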