FedGVD: Efficient Federated Graph Learning via Unidirectional Distillation with Dynamic Virtual Nodes

Published: 09 Nov 2025 · Last Modified: 16 Feb 2026 · Proceedings of the 34th ACM International Conference on Information and Knowledge Management · CC BY 4.0
Abstract: Federated Graph Learning (FGL) has emerged as a key paradigm for distributed graph machine learning, enabling cross-domain collaborative modeling of graphs while preserving data privacy. However, existing methods face two major bottlenecks: structural heterogeneity of graph data across clients weakens the generalization ability of the global model, and model heterogeneity leads to inefficient knowledge sharing and complex global aggregation. To address these issues, we propose FedGVD, an efficient framework that constructs a global perspective through data condensation and server-side virtual node generation, preserving semantic equivalence with the original data while avoiding privacy leakage. FedGVD then distributes low-dimensional, generalizable knowledge for unidirectional distillation, enabling local models to absorb global knowledge without transmitting local parameters and thereby overcoming data and structural heterogeneity as well as model heterogeneity. This design yields privacy-preserving and efficient federated graph collaboration. Experiments show that FedGVD maintains strong performance in heterogeneous-model scenarios while significantly improving communication efficiency, offering a new approach to privacy-preserving collaborative modeling in FGL. The code is available at https://github.com/Jasonxx4/FedGVD.
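The abstract does not specify the distillation objective, so the following is a minimal sketch of the client-side unidirectional step, assuming the server broadcasts virtual-node features together with soft logits from its global view and each client minimizes a standard temperature-scaled KD loss. All names, shapes, and the ClientModel architecture are hypothetical, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical sketch of FedGVD-style unidirectional distillation.
# Assumption: the server sends low-dimensional virtual-node features
# `z_virtual` and soft logits `t_logits`; each client trains its own
# (possibly heterogeneous) model on them and never uploads parameters.

class ClientModel(nn.Module):
    """Stand-in for an arbitrary local model; since no parameter
    aggregation happens, each client may use a different architecture."""
    def __init__(self, in_dim: int, hidden: int, num_classes: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, num_classes)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def distillation_loss(student_logits, teacher_logits, tau: float = 2.0):
    """Temperature-scaled KL divergence (standard knowledge distillation;
    the paper's exact objective may differ)."""
    s = F.log_softmax(student_logits / tau, dim=-1)
    t = F.softmax(teacher_logits / tau, dim=-1)
    return F.kl_div(s, t, reduction="batchmean") * tau ** 2

# Toy round: 64 server-generated virtual nodes, 16-dim features, 7 classes.
z_virtual = torch.randn(64, 16)   # virtual-node features from the server
t_logits = torch.randn(64, 7)     # server soft labels on those nodes
model = ClientModel(in_dim=16, hidden=32, num_classes=7)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

opt.zero_grad()
loss = distillation_loss(model(z_virtual), t_logits)
loss.backward()                   # knowledge flows server -> client only
opt.step()
```

Because only the compact virtual-node payload travels downstream and nothing travels upstream, communication cost per round is independent of client model size, which is consistent with the efficiency claim above.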