Efficient Federated Learning via Variational Dropout

27 Sept 2018 (modified: 05 May 2023) · ICLR 2019 Conference Withdrawn Submission · Readers: Everyone
Abstract: As an emerging field, federated learning has recently attracted considerable attention. Compared to distributed learning in the datacenter setting, federated learning places stricter constraints on the computational efficiency of the learned model and on the communication cost incurred during training. In this work, we propose an efficient federated learning framework based on variational dropout. Our approach jointly learns a sparse model while reducing the amount of gradient information exchanged during the iterative training process. We demonstrate that our approach achieves significant model compression and communication reduction ratios with no loss in accuracy.
Keywords: federated learning, communication-efficient, variational dropout, sparse model
TL;DR: a joint model and gradient sparsification method for federated learning