Enhancing Communication Compression via Discrepancy-aware Calibration for Federated Learning

Published: 26 Jan 2026, Last Modified: 12 Feb 2026, ICLR 2026 Poster, CC BY 4.0
Keywords: Federated Learning; Communication Compression
Abstract: Federated Learning (FL) offers a privacy-preserving paradigm for distributed model training by enabling clients to collaboratively learn a shared model without exchanging their raw data. However, the communication overhead of exchanging model updates remains a critical challenge, particularly for devices with limited bandwidth and battery resources. Existing communication compression methods largely rely on simple heuristics based on magnitude or randomness: for example, Top-k drops elements with small magnitudes, while low-rank methods such as ATOMO and PowerSGD truncate small singular values. However, these rules do not account for the discrepancy between the compressed and original model outputs, which can lead to the loss of important information. To address this issue, we propose a novel discrepancy-aware communication compression method that enhances performance under severely constrained communication budgets. Each client uses a small subset of its local data as calibration data to directly measure the output discrepancy induced by dropping candidate compression units, and uses this discrepancy as a metric to guide the selection. Integrating this strategy enhances existing mainstream compression schemes, enabling more efficient communication. Empirical results across multiple datasets and models show that our method achieves significant accuracy gains under stringent communication constraints, notably an $18.9\%$ relative accuracy improvement at a compression ratio of $0.1$, validating its efficacy for scalable and communication-efficient FL. Our code is available at https://github.com/wzy1026wzy/Discrepancy-aware-Compression-for-FL.
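The abstract does not give implementation details, so the following is only a minimal sketch of the stated idea, assuming PyTorch and hypothetical names (`discrepancy_scores`, `compress_update`): candidate compression units are taken to be rows of a single linear layer's update, each row is scored by the output discrepancy its removal causes on a small calibration batch, and only the highest-scoring rows are kept, in contrast to magnitude-based Top-k.

```python
# Illustrative sketch only; not the authors' implementation.
import torch

def discrepancy_scores(weight, update, calib_x):
    """Score each row of `update` by how much dropping it changes the layer output."""
    full_out = calib_x @ (weight + update).T          # output with the full update applied
    scores = torch.empty(update.shape[0])
    for i in range(update.shape[0]):
        pruned = update.clone()
        pruned[i].zero_()                             # drop candidate compression unit i
        out = calib_x @ (weight + pruned).T
        scores[i] = torch.norm(full_out - out)        # output discrepancy on calibration data
    return scores

def compress_update(weight, update, calib_x, keep_ratio=0.1):
    """Keep only the rows whose removal would distort the output the most."""
    scores = discrepancy_scores(weight, update, calib_x)
    k = max(1, int(keep_ratio * update.shape[0]))
    keep = torch.topk(scores, k).indices
    sparse = torch.zeros_like(update)
    sparse[keep] = update[keep]                       # transmit only the selected units
    return sparse, keep

if __name__ == "__main__":
    torch.manual_seed(0)
    weight = torch.randn(64, 32)                      # current layer weights
    update = torch.randn(64, 32) * 0.01               # local update for this layer
    calib_x = torch.randn(16, 32)                     # small calibration batch of local data
    sparse_update, kept_rows = compress_update(weight, update, calib_x, keep_ratio=0.1)
    print(f"kept {len(kept_rows)} of {update.shape[0]} rows")
```

In this toy form the per-unit rescoring loop is quadratic in the number of units; the paper's actual grouping of compression units and how the metric plugs into Top-k or low-rank schemes would follow the released code rather than this sketch.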
Supplementary Material: zip
Primary Area: optimization
Submission Number: 9520