Keywords: Parameter-free optimization, federated learning, compressed communication
Abstract: This paper addresses the critical challenges of hyperparameter tuning and communication efficiency in federated learning (FL). Despite recent advancements in parameter-free FL algorithms such as PAdaMFed, significant communication overhead remains a major obstacle to their practical deployment. To tackle these challenges, we propose a novel communication-efficient parameter-free FL algorithm, ParFreFL, that halves the communication requirements of PAdaMFed while preserving its parameter-free property. Building on this foundation, we introduce a compressed variant, ComParFreFL, which unifies the momentum increment and error feedback into a single parameter, effectively handling biased compression while maintaining minimal communication cost. Notably, ComParFreFL also operates independently of the compression ratio, representing, to our knowledge, the first instance of such robustness in the compressed FL literature. Theoretically, our methods are proven to handle arbitrary data heterogeneity and partial client participation, and to achieve linear speedup with respect to both local updates and participating clients. Extensive empirical evaluations demonstrate that our approaches match or surpass the performance of carefully tuned alternatives while significantly reducing communication overhead, making FL more accessible and deployable in dynamic, resource-constrained environments.
Primary Area: optimization
Submission Number: 12030