STARS-FL: Accelerating Federated Learning over Heterogeneous Mobile Devices via Spatial-Temporal Aware Reconfigured Subnetworks

20 Sept 2025 (modified: 18 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Federated Learning, Distributed Machine Learning, Heterogeneous Devices, Subnetwork Extraction
Abstract: Federated learning (FL) on mobile devices faces challenges from inherent computing and communication heterogeneity across devices. Subnetwork FL training offers a promising solution by assigning each device the largest feasible subnetwork extracted from the global model for local training. However, existing subnetwork training methods rely on static subnetwork assignments across time and uniform extraction ratios across layers. They overlook (i) the dynamic requirements of local contributions over the course of FL training in the temporal domain and (ii) the varying importance of different layers within subnetworks in the spatial domain, both of which strongly affect FL training performance. In this paper, we propose to accelerate FL training over heterogeneous mobile devices via spatial- and temporal-aware reconfigured subnetworks (STARS-FL). Unlike existing approaches, STARS-FL leverages Fisher Information to identify critical learning periods and has mobile devices adjust their subnetworks accordingly across the FL training process. Further, in the spatial domain, STARS-FL introduces a novel layer-wise subnetwork width adjustment mechanism that enables each device to reconfigure layer widths adaptively based on its layer-specific computational and communication overheads, its real-time computing/communication conditions, and potential straggler effects. Compared with state-of-the-art subnetwork methods, our experiments demonstrate that STARS-FL effectively speeds up FL training while maintaining competitive learning accuracy.
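To make the two mechanisms named in the abstract concrete, below is a minimal, hypothetical Python/PyTorch sketch: it estimates the trace of the empirical diagonal Fisher Information to flag critical learning periods (the temporal side), and allocates per-layer subnetwork widths from a device budget and per-layer costs (the spatial side). All function names, thresholds, and the allocation rule are illustrative assumptions for exposition, not the paper's actual algorithm.

```python
# Hypothetical sketch of the two ideas described in the abstract; the
# thresholds and allocation rule are assumptions, not STARS-FL itself.
import torch
import torch.nn as nn
import torch.nn.functional as F


def fisher_trace(model: nn.Module, inputs: torch.Tensor, targets: torch.Tensor) -> float:
    """Trace of the empirical diagonal Fisher Information: the sum of
    squared gradients of the loss on one mini-batch."""
    model.zero_grad()
    loss = F.cross_entropy(model(inputs), targets)
    loss.backward()
    return sum(
        (p.grad ** 2).sum().item()
        for p in model.parameters()
        if p.grad is not None
    )


def is_critical_period(trace_history: list, rel_change: float = 0.1) -> bool:
    """Heuristic (assumed): a round is 'critical' while the Fisher trace
    is still changing quickly between consecutive rounds, so devices
    should keep larger subnetworks during such rounds."""
    if len(trace_history) < 2:
        return True
    prev, curr = trace_history[-2], trace_history[-1]
    return abs(curr - prev) / max(prev, 1e-12) > rel_change


def layer_widths(base_widths: list, layer_costs: list, budget: float) -> list:
    """Illustrative spatial rule: give each layer a width fraction
    inversely proportional to its normalized compute/communication cost,
    scaled by the device's overall budget in (0, 1]."""
    mean_cost = sum(layer_costs) / len(layer_costs)
    return [
        max(1, int(w * min(1.0, budget * mean_cost / c)))
        for w, c in zip(base_widths, layer_costs)
    ]


if __name__ == "__main__":
    # Toy usage: a tiny model, one synthetic batch, and a 60% budget device.
    model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4))
    x, y = torch.randn(32, 8), torch.randint(0, 4, (32,))
    history = [fisher_trace(model, x, y)]
    print("critical period:", is_critical_period(history))
    print("widths:", layer_widths([16, 4], layer_costs=[2.0, 1.0], budget=0.6))
```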
Supplementary Material: zip
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 23832