EAGLES: Towards Effective, Efficient, and Economical Federated Graph Learning via Unified Sparsification

Published: 01 May 2025, Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
Abstract: Federated Graph Learning (FGL) has gained significant attention as a privacy-preserving approach to collaborative learning, but its computational demands increase substantially as datasets grow and Graph Neural Network (GNN) layers deepen. To address these challenges, we propose $\textbf{EAGLES}$, a unified sparsification framework. EAGLES applies client-consensus parameter sparsification to generate multiple unbiased subnetworks at varying sparsity levels, reducing the need for iterative adjustments and mitigating performance degradation. In the graph structure domain, we introduce a dual-expert approach: a $\textit{graph sparsification expert}$ performs multi-criteria node-level sparsification, and a $\textit{graph synergy expert}$ integrates contextual node information to produce optimal sparse subgraphs. Furthermore, the framework introduces a novel distance metric that leverages node contextual information to measure structural similarity among clients, fostering effective knowledge sharing. We also introduce the $\textbf{Harmony Sparsification Principle}$, under which EAGLES balances model performance with lightweight graph and model structures. Extensive experiments demonstrate its superiority: EAGLES achieves competitive performance on various datasets, reducing training FLOPs by 82\% $\downarrow$ and communication costs by 80\% $\downarrow$ on the ogbn-proteins dataset while maintaining high performance.
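To make the parameter-sparsification idea concrete, here is a minimal sketch of deriving several subnetworks at different sparsity levels from one dense weight tensor. This is an illustration only: it uses plain magnitude pruning as a stand-in, since the paper's client-consensus criterion is not detailed in the abstract, and the function name `sparsify` and the chosen sparsity levels are hypothetical.

```python
import numpy as np

def sparsify(weights, sparsity):
    """Return a copy of `weights` with the smallest-magnitude
    fraction `sparsity` of entries zeroed out (magnitude pruning)."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of entries to prune
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

# Generate subnetworks at several sparsity levels from one dense model,
# avoiding iterative prune-retrain rounds for each level separately.
rng = np.random.default_rng(0)
dense = rng.normal(size=(4, 4))
subnetworks = {s: sparsify(dense, s) for s in (0.5, 0.8)}
```

In a federated setting, each client could apply an agreed-upon criterion of this kind so that all clients hold structurally consistent subnetworks before aggregation; the consensus mechanism itself is what EAGLES contributes beyond this sketch.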
Lay Summary: Federated graph learning faces serious scalability issues as model sizes and graph complexities continue to grow. In this work, we propose EAGLES to address the growing computational and communication challenges in federated graph learning. We design a unified framework that not only reduces redundant information in both graph structures and model parameters but also respects the privacy constraints of real-world applications. Through carefully crafted sparsification strategies and cross-client structural alignment, EAGLES enables efficient and scalable training while preserving performance. The method demonstrates substantial improvements, achieving up to 80% reduction in training and communication costs while maintaining competitive accuracy.
Link To Code: https://github.com/ZitongShi/EAGLES
Primary Area: Deep Learning->Graph Neural Networks
Keywords: Federated Learning, Graph Learning, Sparsification
Submission Number: 10062