BiCompFL: Bi-Directional Compression for Stochastic Federated Learning

Published: 06 Jun 2025, Last Modified: 06 Jun 2025
Venue: ICML Workshop on ML4Wireless
License: CC BY 4.0
Keywords: Communication-efficiency, Importance sampling, Minimal Random Coding, Stochastic federated learning
Abstract: Federated Learning (FL) incurs high communication costs on both the uplink and the downlink. Prior work largely focuses on lossy compression of model updates in deterministic FL. In contrast, stochastic (Bayesian) FL considers distributions over parameters, enabling uncertainty quantification, improved generalization, and inherently communication-regularized training via a mirror-descent structure. We address both uplink and downlink communication in stochastic FL by proposing a framework based on remote source generation. Using Minimal Random Coding (MRC) for remote generation, the server and clients sample from global and local posteriors (sources), respectively, instead of transmitting locally sampled updates. The framework enables communication-regularized local optimization and principled model-update compression, leveraging gradually updated priors as side information. Extensive experiments show that our method achieves a $5$–$32\times$ reduction in total communication while preserving accuracy. We refine MRC bounds to precisely quantify uplink and downlink trade-offs, extend our approach to conventional FL via stochastic quantization, and prove a contraction property for the biased MRC compressor that enables convergence analysis.
Submission Number: 18
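
To make the remote-generation idea in the abstract concrete, below is a minimal, hedged sketch of generic Minimal Random Coding (importance-sampling coding with a shared prior and seed), not the paper's actual implementation. All function names (`mrc_encode`, `mrc_decode`) and the toy Gaussian prior/posterior are illustrative assumptions: the sender selects one of K candidates drawn from the shared prior and transmits only its index (about log2 K bits), and the receiver regenerates the same candidate pool from the shared seed to recover the sample.

```python
import numpy as np

def mrc_encode(posterior_logpdf, prior_sample, prior_logpdf, num_samples, seed):
    """Sketch of an MRC encoder: draw K candidates from the shared prior,
    weight them by the posterior/prior density ratio, and return the index
    of the selected candidate (the only payload that needs to be sent)."""
    rng = np.random.default_rng(seed)
    candidates = prior_sample(rng, num_samples)                 # shape (K, d)
    log_w = posterior_logpdf(candidates) - prior_logpdf(candidates)
    probs = np.exp(log_w - log_w.max())
    probs /= probs.sum()
    index = int(rng.choice(num_samples, p=probs))
    return index

def mrc_decode(prior_sample, num_samples, seed, index):
    """Sketch of an MRC decoder: regenerate the same candidate pool from the
    shared prior and seed, then pick the candidate named by the index."""
    rng = np.random.default_rng(seed)
    candidates = prior_sample(rng, num_samples)
    return candidates[index]

# Toy usage (hypothetical): 1-D Gaussian prior N(0, 1) as side information,
# posterior N(0.5, 0.1^2); the transmitted message is a single ~10-bit index.
d = 1
prior_sample = lambda rng, k: rng.standard_normal((k, d))
prior_logpdf = lambda z: -0.5 * (z ** 2).sum(axis=1)
posterior_logpdf = lambda z: -0.5 * (((z - 0.5) / 0.1) ** 2).sum(axis=1)

idx = mrc_encode(posterior_logpdf, prior_sample, prior_logpdf, num_samples=1024, seed=0)
z_hat = mrc_decode(prior_sample, num_samples=1024, seed=0, index=idx)
```

In the framework described above, the shared prior would correspond to the gradually updated prior used as side information, and the same mechanism would be applied in both directions (clients remotely generating from the global posterior, the server from local posteriors); the details of the bi-directional scheme and the refined bounds are in the paper itself.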