Provably Communication-Efficient Federated Graph Neural Network

ICLR 2026 Conference Submission 15921 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Federated Learning, Graph Neural Network, Fraud Detection
Abstract: Graph neural networks (GNNs) are powerful tools for relational data, but their application is often limited by data silos and privacy concerns, as real-world graphs are frequently distributed across multiple clients. While federated learning (FL) offers a privacy-preserving training paradigm, existing federated GNN approaches suffer from a critical flaw: they either ignore the crucial links between clients, sacrificing accuracy, or require impractically high communication overhead. We introduce CE-FedGNN, a communication-efficient federated GNN framework for such coupled graphs. Instead of sharing raw data or per-iteration embeddings, CE-FedGNN infrequently transmits only aggregated, high-level embeddings, preserving critical structural context while minimizing privacy leakage and communication costs. Despite the challenges of optimization under multi-layer composition and coupled data, we establish a convergence rate of $O(1/\sqrt{T})$ to a stationary point with a communication complexity of $O(T^{3/4})$. We further derive bounds for injecting Gaussian noise that provide formal differential privacy. Our experiments on a synthetic interbank anti-money laundering task demonstrate the effectiveness of CE-FedGNN, which is preserved even when Gaussian noise is injected for differential privacy.
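The communication pattern the abstract describes (infrequent transmission of aggregated embeddings, with Gaussian noise added before sharing for differential privacy) could be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's actual algorithm: the function names, the mean-pooling aggregation, the single-layer encoder, and the synchronization schedule are all assumptions for exposition.

```python
import numpy as np

def local_embed(features, weights):
    """Toy stand-in for a client's multi-layer GNN encoder:
    one linear layer + ReLU (hypothetical, for illustration only)."""
    return np.maximum(features @ weights, 0.0)

def round_step(client_feats, weights, t, sync_every, sigma, rng):
    """One training round. Clients exchange aggregated embeddings only
    every `sync_every` rounds, and Gaussian noise of std `sigma` is
    added to each aggregate before transmission (Gaussian mechanism).
    All names here are illustrative assumptions, not the paper's API."""
    shared = None
    if t % sync_every == 0:  # infrequent communication round
        # Each client mean-pools its local node embeddings into one vector ...
        aggregates = [local_embed(x, weights).mean(axis=0) for x in client_feats]
        # ... and perturbs it with Gaussian noise before sending it out.
        noisy = [a + rng.normal(0.0, sigma, a.shape) for a in aggregates]
        shared = np.mean(noisy, axis=0)  # server-side average of noisy aggregates
    # In non-communication rounds, clients train on local graphs only.
    return shared
```

In this sketch, raising `sync_every` trades accuracy of the cross-client context for fewer communication rounds, mirroring the $O(T^{3/4})$ communication complexity claim relative to $T$ total iterations.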
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 15921