FedPPD: Towards Effective Subgraph Federated Learning via Pseudo Prototype Distillation

Published: 22 Apr 2025, Last Modified: 13 May 2025 · Neural Networks · CC BY-NC-ND 4.0
Abstract: Subgraph federated learning (subgraph-FL) is a distributed machine learning paradigm that enables cross-client collaborative training of graph neural networks (GNNs). However, real-world subgraph-FL scenarios often face the subgraph heterogeneity problem, i.e., variations in nodes and topology across multiple subgraphs, which degrades the performance of the global model. Although several well-designed methods have been proposed, most still rely on a parameter-aggregation-based global GNN for inference, which oversimplifies the subgraph knowledge and leads to sub-optimal performance. To this end, we propose achieving effective subgraph federated learning via pseudo prototype distillation (FedPPD). Specifically, FedPPD first employs a generator, guided by local prototypes, to explore the global input space. The generated pseudo graph is then used to distill knowledge from the local GNNs into the vanilla-aggregated global GNN, restoring reliable knowledge that is oversimplified during aggregation. Extensive experiments on six public datasets demonstrate that FedPPD consistently outperforms state-of-the-art baselines. Our code is available at https://github.com/KyrieLQ/FedPPD.
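To make the distillation step concrete, below is a minimal PyTorch sketch of how knowledge from the local GNNs might be transferred to the aggregated global GNN on a generated pseudo graph. The function name distill_global_gnn, the uniform averaging of teacher logits, the (x, edge_index) forward interface, and the temperature-scaled KL loss are illustrative assumptions rather than the paper's exact formulation; see the linked repository for the authors' implementation.

import torch
import torch.nn.functional as F

def distill_global_gnn(global_gnn, local_gnns, pseudo_x, pseudo_edge_index,
                       optimizer, temperature=1.0):
    """One distillation step on a generated pseudo graph (sketch).

    `pseudo_x` and `pseudo_edge_index` stand in for the generator's
    output; their exact form in FedPPD is an assumption here.
    """
    global_gnn.train()

    # Ensemble the frozen local teachers on the pseudo graph.
    with torch.no_grad():
        teacher_logits = torch.stack(
            [gnn(pseudo_x, pseudo_edge_index) for gnn in local_gnns]
        ).mean(dim=0)

    # Student = the parameter-aggregated (vanilla FedAvg-style) global GNN.
    student_logits = global_gnn(pseudo_x, pseudo_edge_index)

    # Standard knowledge-distillation loss: KL divergence between the
    # softened teacher and student class distributions.
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

In this reading, the pseudo graph lets the server refine the aggregated model against the local models' ensemble without accessing any client's raw subgraph, which is what allows the distillation to recover knowledge lost to plain parameter averaging.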