Keywords: federated learning, distributed learning, machine learning, optimization
TL;DR: We introduce Generalized Heavy-Ball Momentum, a method that guarantees convergence and improves robustness in federated learning with heterogeneous and unreliable data
Abstract: Reliable machine learning requires robustness to unreliable and heterogeneous data, a challenge that is particularly acute in Federated Learning (FL). Standard optimization methods degrade under the combined effects of data heterogeneity and partial client participation, while existing momentum variants introduce biased updates that undermine reliability.
We propose Generalized Heavy-Ball Momentum (GHBM), a principled optimization method that eliminates this bias and provides convergence guarantees even under unbounded heterogeneity and cyclic participation. We further develop adaptive, communication-efficient variants that retain the efficiency of FedAvg. Extensive experiments on vision and language benchmarks confirm that GHBM substantially improves robustness and reliability compared to state-of-the-art FL methods, particularly in large-scale settings with limited participation. These results establish GHBM as a reliable foundation for distributed learning in environments with imperfect data.
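For context, a minimal sketch of the baseline this work generalizes: classical heavy-ball momentum applied to server-side aggregation in a FedAvg-style loop. All function and variable names here are hypothetical, and this is the standard (biased under partial participation) momentum update that the abstract contrasts with, not the paper's GHBM correction itself.

```python
import numpy as np

def server_round_heavy_ball(global_w, client_updates, velocity, lr=0.1, beta=0.9):
    """One hypothetical server round: average client pseudo-gradients,
    then apply a classical heavy-ball momentum step.
    Note: with partial client participation, this naive average is the
    source of the bias that GHBM is designed to eliminate."""
    avg_update = np.mean(client_updates, axis=0)   # aggregate sampled clients only
    velocity = beta * velocity + avg_update        # heavy-ball momentum accumulation
    global_w = global_w - lr * velocity            # momentum step on the global model
    return global_w, velocity

# Toy usage with synthetic client updates
rng = np.random.default_rng(0)
w = np.zeros(5)
v = np.zeros(5)
for _ in range(10):
    sampled = [rng.normal(size=5) for _ in range(3)]  # 3 participating clients
    w, v = server_round_heavy_ball(w, sampled, v)
```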
Submission Number: 28