Class-Grouped-Normalized-Momentum and Faster Hyperparameter Exploration to Tackle Class Imbalance in Federated Learning

16 Sept 2025 (modified: 19 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Federated Learning, Class Imbalance, Hyperparameter Optimization, Long Tail Learning
TL;DR: FedCGNM mitigates class imbalance in federated learning by grouping classes, maintaining a unit-normalized momentum per group, and summing the normalized momenta to form the update direction. FedHOO uses XAB-based search to select efficient client sampling rates in small federations.
Abstract: Local class imbalance rooted in global imbalance poses a critical challenge in federated learning (FL), where underrepresented classes suffer from poor predictive performance yet cannot be addressed by standard centralized techniques due to privacy and heterogeneity constraints. We propose FedCGNM (Federated Class-Grouped Normalized Momentum), a client-side optimizer for FL that partitions classes into a small number of groups, maintains a momentum buffer per group, normalizes each group's momentum to unit length, and uses the sum of the normalized group momenta as the update direction. This design both equalizes gradient magnitudes across majority and minority groups and mitigates the noise inherent in rare-class gradients. Additionally, a resampling mechanism is employed to further mitigate class imbalance. To select client sampling rates efficiently in small-client federations, we propose FedHOO, an X-armed-bandit (XAB) based algorithm that exploits federated parallelism to evaluate many combinations of two candidate rates per client at linear cost. Empirical evaluation on four public long-tailed benchmarks and a proprietary chip-defect dataset demonstrates that FedCGNM consistently outperforms baselines and that coupling it with FedHOO yields further improvements in small-scale federations.
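The abstract describes the FedCGNM update rule concretely enough to sketch: one momentum buffer per class group, each normalized to unit length, summed into a single direction. Below is a minimal, hypothetical PyTorch rendering of that step. The function name `cgnm_step`, the hyperparameters `beta` and `eps`, and the flat-tensor layout are illustrative assumptions, not taken from the paper; how per-group gradients are computed on a local batch is left to the caller.

```python
import torch

def cgnm_step(model, per_group_grads, group_momenta, lr=0.01, beta=0.9, eps=1e-12):
    """One client-side step in the spirit of FedCGNM (hypothetical sketch).

    per_group_grads: list of flat gradient tensors, one per class group,
        e.g. gradients of the local loss restricted to that group's examples.
    group_momenta: list of flat momentum buffers, updated in place.
    """
    direction = torch.zeros_like(group_momenta[0])
    for grad, m in zip(per_group_grads, group_momenta):
        m.mul_(beta).add_(grad, alpha=1.0 - beta)  # per-group momentum update
        direction += m / (m.norm() + eps)          # unit-normalize, then sum across groups
    # Apply the summed unit-norm direction to the model's flattened parameters.
    offset = 0
    for p in model.parameters():
        n = p.numel()
        p.data.add_(direction[offset:offset + n].view_as(p), alpha=-lr)
        offset += n

# Toy usage: two class groups, tiny linear model; random tensors stand in
# for the per-group gradients a real client would compute on its batch.
model = torch.nn.Linear(4, 2)
dim = sum(p.numel() for p in model.parameters())
momenta = [torch.zeros(dim) for _ in range(2)]
grads = [torch.randn(dim) for _ in range(2)]
cgnm_step(model, grads, momenta)
```

Because each group's contribution has unit norm regardless of how many examples the group contributed, a majority group cannot dominate the update direction, which is the equalizing effect the abstract claims.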
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 8109