Resource-Adaptive Federated Learning with All-In-One Neural Composition

Published: 31 Oct 2022, Last Modified: 10 Oct 2022
NeurIPS 2022 Accept
Readers: Everyone
Keywords: Federated Learning, System Heterogeneity
Abstract: Conventional Federated Learning (FL) systems inherently assume a uniform processing capacity among clients for deployed models. In practice, however, diverse client hardware leads to varying computation resources. Such system heterogeneity results in an inevitable trade-off between model complexity and data accessibility, which becomes a bottleneck. To avoid this dilemma and achieve resource-adaptive federated learning, we introduce a simple yet effective mechanism, termed All-In-One Neural Composition, to systematically support training complexity-adjustable models with flexible resource adaptation. It efficiently constructs models at various complexities from one unified neural basis shared among clients, instead of pruning the global model into local ones. The proposed mechanism grants the system unhindered access to the full range of knowledge scattered across clients and generalizes existing pruning-based solutions by allowing soft and learnable extraction of low-footprint models. Extensive experimental results on popular FL benchmarks demonstrate the effectiveness of our approach. The resulting FL system empowered by our All-In-One Neural Composition, called FLANC, shows consistent performance gains across diverse system/data heterogeneity setups while maintaining high efficiency in computation and communication.
TL;DR: To cope with unaligned client capacity in federated learning, we propose All-In-One Neural Composition to enable unhindered access to the knowledge scattered across heterogeneous devices.
Supplementary Material: pdf
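
The abstract describes the composition mechanism only at a high level. As a rough illustration of the idea of building models of different complexities from one shared neural basis (rather than pruning a global model), the sketch below composes linear-layer weights of different widths from a single shared basis. This is a minimal sketch under our own assumptions; the class name `ComposedLinear`, the shapes, and the initialization are illustrative and are not taken from the paper's implementation.

```python
# Minimal sketch (not the authors' code): composing layer weights of different
# widths from one shared basis, in the spirit of All-In-One Neural Composition.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ComposedLinear(nn.Module):
    """Linear layer whose weight is the product of a shared basis and a
    capacity-specific coefficient, instead of a pruned slice of a global weight."""
    def __init__(self, shared_basis: nn.Parameter, in_features: int, out_features: int):
        super().__init__()
        assert shared_basis.shape[1] == in_features
        basis_dim = shared_basis.shape[0]           # rank of the shared basis
        self.basis = shared_basis                    # shared across clients/capacities
        # capacity-specific, learnable coefficient (small communication footprint)
        self.coeff = nn.Parameter(torch.randn(out_features, basis_dim) * 0.01)
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        weight = self.coeff @ self.basis             # compose the full weight on the fly
        return F.linear(x, weight, self.bias)

# One shared basis serves every client capacity.
in_features, basis_dim = 64, 16
shared_basis = nn.Parameter(torch.randn(basis_dim, in_features) * 0.01)

# A low-capacity client composes a narrow layer, a high-capacity client a wide one;
# both train, and the server aggregates, the same shared basis.
small_layer = ComposedLinear(shared_basis, in_features, out_features=32)
large_layer = ComposedLinear(shared_basis, in_features, out_features=128)
```

In this sketch, clients of any capacity contribute gradients to the same shared basis, which is one way to realize the abstract's claim of unhindered access to knowledge scattered across heterogeneous devices; the capacity-specific coefficients play the role of the soft, learnable extraction of low-footprint models.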