Optimizing Federated Learning Client Selection via Multi-Objective Contextual Bandits

TMLR Paper4812 Authors

09 May 2025 (modified: 23 Oct 2025) · Rejected by TMLR · CC BY 4.0
Abstract: In the rapidly evolving field of Machine Learning (ML), Federated Learning (FL) has emerged as an approach for training models across distributed devices without centralizing raw data. However, FL faces significant challenges due to the heterogeneous nature of client devices, which leads to non-IID data distributions and varied resource constraints. Moreover, the inherent bandwidth limitations of decentralized settings necessitate efficient use of both network and energy resources. Energy efficiency, i.e., minimizing the energy consumed per unit of training time, reduces battery strain and cuts down on unnecessary data transmissions, which in turn improves network efficiency and contributes to environmental sustainability. To address these challenges, we introduce a novel solution, Pareto Contextual Zooming for Federated Learning (PCZFL), which treats the client selection problem in FL as a multi-objective bandit problem. Our method optimizes global accuracy and energy efficiency in parallel. By dynamically adjusting client selection based on real-time accuracy and energy context, the proposed solution ensures effective participation while minimizing energy consumption. In addition, we provide theoretical analysis of both the regret bound and the time complexity of our method. Extensive experiments on CIFAR-10 demonstrate that PCZFL achieves a 4.7% improvement in global accuracy with 37.3% lower energy cost compared to Pow-d, the second-best method in global accuracy, and delivers a 9.5% accuracy gain while reducing energy consumption by 30.9% relative to NCCB, the second-best method in energy efficiency. On FMNIST, PCZFL further achieves a 3.2% improvement in global accuracy and a 21.4% reduction in energy cost relative to Pow-d, while delivering a 3.7% accuracy gain and a 30.8% reduction in energy consumption relative to NCCB.
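The abstract frames client selection as choosing, each round, among clients described by two objectives: estimated accuracy gain and energy cost. The paper's full contextual-zooming machinery is not described here; as a minimal illustration only, the following hedged Python sketch shows the core multi-objective idea of restricting selection to Pareto-optimal (non-dominated) clients. All names (`pareto_front`, `select_clients`) and the per-client estimates are hypothetical and are not taken from the paper.

```python
import random

def pareto_front(estimates):
    """Return indices of non-dominated clients.

    Each entry of `estimates` is a (accuracy_gain, energy_cost) pair.
    Client j dominates client i if it has accuracy_gain >= i's and
    energy_cost <= i's, with at least one inequality strict.
    """
    front = []
    for i, (acc_i, en_i) in enumerate(estimates):
        dominated = any(
            acc_j >= acc_i and en_j <= en_i and (acc_j > acc_i or en_j < en_i)
            for j, (acc_j, en_j) in enumerate(estimates) if j != i
        )
        if not dominated:
            front.append(i)
    return front

def select_clients(estimates, k, rng=None):
    """Sample up to k clients uniformly from the Pareto front.

    Uniform sampling among non-dominated clients is one simple way to
    trade off the two objectives without fixing a scalarization weight.
    """
    rng = rng or random.Random(0)
    front = pareto_front(estimates)
    return rng.sample(front, min(k, len(front)))
```

For example, given estimates `[(0.9, 5.0), (0.8, 2.0), (0.7, 1.0), (0.6, 3.0)]`, the last client is dominated by the second (lower accuracy gain and higher energy cost), so only the first three are eligible for selection. A bandit method such as the one the paper proposes would additionally maintain confidence bounds over these estimates and refine them adaptively.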
Submission Length: Regular submission (no more than 12 pages of main content)
Previous TMLR Submission Url: https://openreview.net/forum?id=0A8sG8LYbc
Changes Since Last Submission: In this revision, we have carefully addressed the comments from all reviewers. For ease of review, the changes are highlighted in red throughout the manuscript.
Assigned Action Editor: ~Tian_Li1
Submission Number: 4812