Prototype-Guided Federated Knowledge Distillation Approach in LEO Satellite-HAP System

Published: 01 Jan 2025, Last Modified: 06 Nov 2025 · ICC 2025 · CC BY-SA 4.0
Abstract: Low Earth orbit (LEO) satellites nowadays play a pivotal role in collecting images for Earth observation. However, the volume of images collected by satellites can be enormous, which poses challenges for processing satellite imagery. These challenges include: 1) the impracticality of transmitting such massive image data to the ground station for centralized analysis, owing to restricted satellite communication bandwidth and data privacy concerns, and 2) the fact that satellite data may be non-independent and identically distributed (non-IID). In this paper, we propose a prototype-guided federated knowledge distillation (Pro-FedKD) approach for an LEO satellite-high altitude platform (HAP) system, designed on the basis of self-knowledge distillation (SKD), federated prototype learning (FedProto), and federated learning (FL). FL addresses the first challenge, since it does not require data to leave the local side. To cope with the second challenge, SKD and FedProto are employed. In addition, both model aggregation and prototype aggregation are performed on a pre-defined HAP. To enhance effectiveness, a top-$N$ model aggregation mechanism is proposed, in which the $N$ local models that achieve the highest accuracies on the validation dataset of the pre-defined HAP are selected for aggregation. Experiments demonstrate that the error rate achieved by the proposed Pro-FedKD method is 3.76×, 3.17×, 1.55×, and 1.18× smaller than those of FedExP, MOON, FedProto, and pFedSD, respectively, on the EuroSAT dataset, a significant reduction. The proposed method also exhibits superior performance on other datasets.
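The top-$N$ model aggregation mechanism described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `aggregate_top_n` and the plain-Python parameter representation (layer name to list of floats) are assumptions, and the uniform averaging of the selected models is one plausible reading of the aggregation step.

```python
# Hypothetical sketch of top-N model aggregation: the HAP evaluates each
# client's local model on its validation set, keeps the N most accurate,
# and averages their parameters into a global model. All names here are
# illustrative, not taken from the paper.

def aggregate_top_n(local_models, val_accuracies, n):
    """Average the parameters of the n local models with the highest
    validation accuracy.

    local_models:   client id -> parameter dict (layer name -> list of floats)
    val_accuracies: client id -> accuracy measured on the HAP's validation set
    """
    # Rank clients by validation accuracy and keep the top n.
    top_clients = sorted(val_accuracies, key=val_accuracies.get, reverse=True)[:n]
    selected = [local_models[c] for c in top_clients]

    # Uniformly average each parameter across the selected models.
    global_model = {}
    for layer in selected[0]:
        global_model[layer] = [
            sum(m[layer][i] for m in selected) / len(selected)
            for i in range(len(selected[0][layer]))
        ]
    return top_clients, global_model
```

For example, with three clients whose validation accuracies are 0.6, 0.9, and 0.8, setting $N = 2$ selects the latter two and averages only their parameters, discarding the weakest model from this round's aggregate.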