FedCLR+: Tackling Onboard Label Constraints for Accurate Federated Satellite Computing

Published: 2025 · Last Modified: 04 Nov 2025 · IEEE Trans. Serv. Comput. 2025 · CC BY-SA 4.0
Abstract: The rapid growth of Low Earth Orbit (LEO) satellites, particularly with the increasing deployment of intelligent computing capabilities on commercial off-the-shelf (COTS) hardware, presents significant opportunities to enhance the quality of in-orbit services. However, current onboard resources remain insufficient to improve model accuracy simply by increasing model size, and inadequate accuracy hampers the effectiveness of in-orbit services. The satellite-ground federated learning (FL) paradigm, which leverages collaborative fine-tuning, offers a promising way to continuously improve onboard model performance. While prior studies have focused on optimizing fine-tuning under constraints such as limited bandwidth and computational resources, they often overlook two critical challenges: the scarcity and skewness of labeled onboard data and the long revisit cycles of satellites. To address these challenges and better support in-orbit services, this article designs a realistic simulation methodology for the onboard fine-tuning process and conducts a comprehensive measurement study. Based on insights from the measurement results, we propose an efficient satellite-ground federated fine-tuning system, FedCLR+. In this system, we design the FedCLR algorithm to enhance system accuracy through representation optimization. Additionally, we propose a hybrid bias-compensated strategy to further mitigate accuracy loss by enriching the diversity of aggregation information. Experimental results show that FedCLR+ significantly enhances accuracy by up to 21.61×, reduces transmission volume by an average of 7.29%, and maintains acceptable additional overhead compared to baselines.
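For context on the satellite-ground federated fine-tuning setting the abstract describes, the sketch below shows a single aggregation round using plain FedAvg-style weighted averaging of per-satellite model updates. This is only an illustrative assumption of the general paradigm, not the paper's FedCLR algorithm or its hybrid bias-compensated strategy, whose details are not given in the abstract; all names (`aggregate_round`, `satellite_updates`) are hypothetical.

```python
# Hypothetical sketch of one satellite-ground federated fine-tuning round
# using plain FedAvg-style weighted averaging. Not the paper's FedCLR+
# algorithm; purely illustrative of the general paradigm.
from typing import Dict, List, Tuple

import numpy as np


def aggregate_round(
    satellite_updates: List[Tuple[Dict[str, np.ndarray], int]],
) -> Dict[str, np.ndarray]:
    """Average per-satellite model weights, weighted by local sample count."""
    total_samples = sum(n for _, n in satellite_updates)
    global_weights: Dict[str, np.ndarray] = {}
    for weights, n_samples in satellite_updates:
        for name, tensor in weights.items():
            # Each satellite contributes proportionally to its labeled data.
            contribution = tensor * (n_samples / total_samples)
            if name in global_weights:
                global_weights[name] += contribution
            else:
                global_weights[name] = contribution.copy()
    return global_weights


# Example: three satellites report fine-tuned weights after a ground contact.
updates = [
    ({"layer.w": np.ones((2, 2)) * scale}, n)
    for scale, n in zip((1.0, 2.0, 3.0), (100, 50, 25))
]
print(aggregate_round(updates)["layer.w"])
```

A plain weighted average like this is exactly where the challenges named in the abstract (skewed labeled data and long revisit cycles) cause bias, which motivates representation optimization and bias-compensated aggregation in FedCLR+.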