Uplink-Aware Federated Learning Based on Model Pruning in Satellite Networks

Published: 06 Jun 2025, Last Modified: 06 Jun 2025, ICML Workshop on ML4Wireless, CC BY 4.0
Keywords: Low earth orbit (LEO) satellites, Federated learning, Link scheduling, Neural network pruning
TL;DR: This work jointly uses uplink scheduling and neural network pruning to alleviate the ground-satellite uplink problems in satellite federated learning.
Abstract: Satellite federated learning (SFL) allows satellites to collaboratively train models without sharing raw data, enhancing privacy and reducing communication costs. Traditional SFL requires a ground station (GS) to upload models to satellites, on the premise that ground-satellite uplink (GSUL) resources are adequate. However, this assumption does not hold in dense LEO constellations, where frequent command interaction and parameter delivery make the bandwidth-constrained uplink a bottleneck. This work proposes satellite federated learning with uplink scheduling and model pruning (FedLSMP). The key idea is to jointly optimize the GSUL bandwidth allocation plan and the model compression ratio to maximize the approximated loss reduction while adhering to bandwidth constraints. Numerical results demonstrate that FedLSMP improves convergence rates while reducing GSUL bandwidth usage, achieving higher overall effectiveness than conventional SFL approaches.
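The abstract does not specify FedLSMP's pruning criterion; a common baseline that matches the "model compression ratio" framing is magnitude-based pruning, sketched below. The function name `prune_by_magnitude` and the use of a single global threshold are illustrative assumptions, not the paper's method.

```python
# Hedged sketch: magnitude-based pruning to a target compression ratio.
# The ratio could then be chosen jointly with GSUL bandwidth allocation,
# as FedLSMP proposes; that optimization step is not reproduced here.
import numpy as np

def prune_by_magnitude(weights: np.ndarray, keep_ratio: float) -> np.ndarray:
    """Zero out the smallest-magnitude entries, keeping only a
    `keep_ratio` fraction of the parameters (illustrative helper)."""
    flat = np.abs(weights).ravel()
    k = int(np.ceil(keep_ratio * flat.size))
    if k == 0:
        return np.zeros_like(weights)
    threshold = np.partition(flat, -k)[-k]  # k-th largest magnitude
    return np.where(np.abs(weights) >= threshold, weights, 0.0)

# Toy example: keep the 2 largest-magnitude weights out of 4.
w = np.array([[0.5, -0.1], [0.05, 1.2]])
pruned = prune_by_magnitude(w, 0.5)
```

Transmitting only the surviving weights (plus a sparsity mask or indices) is what reduces the GSUL payload, at the cost of the approximation error the loss-reduction objective trades off against.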
Submission Number: 10