Communication-Efficient Satellite-Ground Federated Learning Through Progressive Weight Quantization

Published: 01 Jan 2024 · Last Modified: 20 May 2025 · IEEE Trans. Mob. Comput. 2024 · CC BY-SA 4.0
Abstract: Large constellations of Low Earth Orbit (LEO) satellites have been launched for Earth observation and satellite-ground communication, collecting massive volumes of imagery and sensor data. These data can enhance the AI capabilities of satellites to address global challenges such as real-time disaster navigation and mitigation. Prior studies proposed leveraging federated learning (FL) between satellites and the ground to collaboratively train a shared machine learning (ML) model in a privacy-preserving manner. However, each of these studies mostly focuses on a single challenge, such as limited ground-to-satellite bandwidth, short connection windows, or long connection cycles, while ignoring the need to address all of these challenges together when deploying an efficient FL framework in space. In this paper, we propose an efficient satellite-ground FL framework, SatelliteFL, that addresses these three challenges collectively. Its key idea is to ensure that each satellite completes its per-round training within each connection window. Moreover, we design a progressive block-wise quantization algorithm that determines a unique bitwidth for each block of the ML model, maximizing model utility without exceeding the connection window. We evaluate SatelliteFL on an implemented FL platform with real-world satellite networks and satellite imagery. The results show that SatelliteFL accelerates convergence by up to 2.8× and improves the bandwidth utilization ratio by up to 9.3× compared to state-of-the-art methods.
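The abstract does not spell out the progressive block-wise quantization algorithm, but its general idea (give each block of the model its own bitwidth so that the total upload fits within the connection window) can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the symmetric uniform quantizer, the greedy error-reduction-per-bit criterion, and the names quant_error, allocate_bitwidths, min_b, and max_b are hypothetical and are not taken from the paper.

```python
import numpy as np

def quant_error(w, bits):
    """MSE of symmetric uniform quantization of block w at `bits` bits (hypothetical helper)."""
    qmax = 2 ** (bits - 1) - 1
    scale = np.abs(w).max() / max(qmax, 1)
    if scale == 0.0:
        return 0.0  # all-zero block quantizes losslessly
    q = np.clip(np.round(w / scale), -qmax - 1, qmax)
    return float(np.mean((q * scale - w) ** 2))

def allocate_bitwidths(blocks, budget_bits, min_b=2, max_b=8):
    """Greedily assign a per-block bitwidth under a total upload budget.

    budget_bits would come from the link, e.g. window_seconds * uplink_bps.
    Every block starts at min_b bits; each step grants one more bit to the
    block with the largest quantization-error reduction per transmitted bit,
    stopping when the budget (or max_b everywhere) is exhausted.
    """
    bits = [min_b] * len(blocks)
    used = sum(b * blk.size for b, blk in zip(bits, blocks))
    while True:
        best_i, best_gain = None, 0.0
        for i, blk in enumerate(blocks):
            if bits[i] >= max_b or used + blk.size > budget_bits:
                continue  # block is maxed out, or one more bit would overflow the window
            gain = (quant_error(blk, bits[i]) - quant_error(blk, bits[i] + 1)) / blk.size
            if gain > best_gain:
                best_i, best_gain = i, gain
        if best_i is None:
            break
        bits[best_i] += 1
        used += blocks[best_i].size
    return bits

# Toy usage: three blocks of very different sensitivity, under a tight budget
# (~4 bits per weight on average), so the allocation must differentiate.
blocks = [np.random.randn(50_000),
          np.random.randn(200_000) * 0.01,
          np.random.randn(100_000)]
print(allocate_bitwidths(blocks, budget_bits=1_400_000))
```

On the receiving side, the ground station would dequantize each block from its integer codes and per-block scale. The error-per-bit greedy rule is just one plausible proxy for the utility objective the abstract mentions; the "progressive" schedule across training rounds is not modeled in this sketch.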