Federated learning over physical channels: adaptive algorithms with near-optimal guarantees

Published: 24 Sept 2025, Last Modified: 18 Nov 2025
Venue: AI4NextG @ NeurIPS 25 Poster
License: CC BY 4.0
Keywords: federated learning, noisy channels, stochastic optimization, non-asymptotic guarantees, ADC/DAC quantization
Abstract: In federated learning, communication cost can be reduced significantly by transmitting real-valued gradient information directly over physical channels. However, the bias induced by hardware quantization and the large variance caused by channel noise pose significant challenges for convergence analysis and algorithm design. In this paper, we propose a new class of pre-coding and post-coding techniques that ensure exact unbiasedness and low variance of the transmitted stochastic gradient. Building on these techniques, we design adaptive federated stochastic gradient descent (SGD) algorithms that can be implemented over physical channels for both downlink broadcasting and uplink transmission. We establish theoretical guarantees for the proposed algorithms, demonstrating convergence rates that adapt to the stochastic gradient noise level from the data, with no degradation due to channel noise. We also demonstrate the practical effectiveness of our algorithms through simulation studies with deep learning models. Our simulations on the CIFAR-10 and MNIST datasets show test accuracy matching that of a full-precision coded channel while using only 20% of the communication symbols.
Submission Number: 70
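The abstract describes pre-coding at the transmitter and post-coding at the receiver so that the gradient estimate recovered from the noisy, quantized channel remains exactly unbiased with controlled variance. The sketch below is a minimal illustration of that general idea only, not the paper's actual pre-/post-coding scheme: it assumes a uniform quantizer with subtractive dither shared between transmitter and receiver (for unbiasedness), a zero-mean AWGN channel, and simple repetition averaging at the receiver (for variance reduction); all function names are hypothetical.

```python
# Illustrative sketch (assumptions above), not the paper's algorithm:
# unbiased transmission of a gradient over a quantized, noisy analog channel.
import numpy as np

rng = np.random.default_rng(0)

def precode(grad, step):
    """Add uniform dither before quantization so the estimate is unbiased."""
    dither = rng.uniform(-step / 2, step / 2, size=grad.shape)
    quantized = step * np.round((grad + dither) / step)  # DAC-style quantization
    return quantized, dither

def channel(symbols, noise_std):
    """Zero-mean AWGN physical channel."""
    return symbols + rng.normal(0.0, noise_std, size=symbols.shape)

def postcode(received, dither):
    """Subtract the shared dither; in expectation the output equals the gradient."""
    return received - dither

def transmit(grad, step=0.05, noise_std=0.1, repeats=4):
    """Average a few repetitions to shrink the channel-noise variance."""
    estimates = []
    for _ in range(repeats):
        symbols, dither = precode(grad, step)
        estimates.append(postcode(channel(symbols, noise_std), dither))
    return np.mean(estimates, axis=0)

grad = rng.normal(size=1000)
est = transmit(grad)
print("empirical bias ~", np.mean(est - grad))       # close to 0 (unbiased)
print("empirical MSE  ~", np.mean((est - grad) ** 2))  # shrinks as repeats grows
```

In a federated SGD loop, such an unbiased, bounded-variance estimate can simply replace the exact gradient on both the downlink and the uplink, which is what makes convergence analysis in the style summarized by the abstract possible.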