Efficient Federated Learning via Variational Dropout

Wei Du, Xiao Zeng, Ming Yan, Mi Zhang

Sep 27, 2018 · ICLR 2019 Conference · Withdrawn Submission
  • Abstract: As an emerging field, federated learning has recently attracted considerable attention. Compared to distributed learning in the datacenter setting, federated learning imposes stricter constraints on the computation efficiency of the learned model and on the communication cost incurred during training. In this work, we propose an efficient federated learning framework based on variational dropout. Our approach jointly learns a sparse model while reducing the volume of gradients exchanged during the iterative training process (see the sketch after this list). We demonstrate that our approach achieves significant model compression and communication reduction ratios with no loss in accuracy.
  • Keywords: federated learning, communication efficient, variational dropout, sparse model
  • TL;DR: a joint model and gradient sparsification method for federated learning
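The submission itself provides no code, so the following is only a minimal sketch of the mechanism the abstract describes, assuming the sparse variational dropout formulation of Molchanov et al. (2017): each weight carries a learned dropout rate, a KL penalty drives uninformative weights toward full dropout, and a client then communicates only the updates for weights that survive pruning. All names here (VDLinear, sparse_update, thresh) are hypothetical, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class VDLinear(nn.Module):
    # Linear layer with per-weight variational dropout (Molchanov et al., 2017).
    def __init__(self, in_features, out_features, thresh=3.0):
        super().__init__()
        self.weight = nn.Parameter(0.02 * torch.randn(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))
        # log sigma^2: one learned noise scale per weight
        self.log_sigma2 = nn.Parameter(torch.full((out_features, in_features), -10.0))
        self.thresh = thresh  # prune weights whose log alpha exceeds this

    @property
    def log_alpha(self):
        # alpha = sigma^2 / w^2 is the effective per-weight dropout rate parameter
        return (self.log_sigma2 - torch.log(self.weight ** 2 + 1e-8)).clamp(-10, 10)

    def forward(self, x):
        if self.training:
            # local reparameterization trick: sample activations, not weights
            mu = F.linear(x, self.weight, self.bias)
            var = F.linear(x ** 2, torch.exp(self.log_sigma2)) + 1e-8
            return mu + var.sqrt() * torch.randn_like(mu)
        # at inference, weights with high dropout rate are zeroed out
        mask = (self.log_alpha < self.thresh).float()
        return F.linear(x, self.weight * mask, self.bias)

    def kl(self):
        # Approximation of the KL term from Molchanov et al., 2017;
        # minimizing it pushes uninformative weights toward full dropout.
        k1, k2, k3 = 0.63576, 1.8732, 1.48695
        la = self.log_alpha
        neg_kl = k1 * torch.sigmoid(k2 + k3 * la) - 0.5 * F.softplus(-la) - k1
        return -neg_kl.sum()

def sparse_update(layer, dense_update):
    # Hypothetical helper: keep only update entries for surviving weights,
    # so a client sends (indices, values) instead of the dense tensor.
    mask = layer.log_alpha < layer.thresh
    idx = mask.nonzero(as_tuple=False)
    return idx, dense_update[mask]

In such a setup, a client would locally minimize task_loss + beta * sum of layer.kl() over layers, then transmit only the sparsified update; beta and thresh are illustrative knobs, not values taken from the paper.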