Less is More: Federated Graph Learning with Alleviating Topology Heterogeneity from A Causal Perspective

Published: 01 May 2025, Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
Abstract: Federated graph learning (FGL) aims to collaboratively train a global graph neural network (GNN) over multiple private graphs while preserving local data privacy. Beyond the data heterogeneity common in conventional federated learning, FGL faces the unique challenge of topology heterogeneity. Most existing FGL methods alleviate the negative impact of heterogeneity by introducing global signals; however, these incremental signals may be ineffective and substantially increase the computational cost. In light of this, we propose FedATH, an FGL method that Alleviates Topology Heterogeneity from a causal perspective. Inspired by causal theory, we argue that not all edges in a topology are necessary for the training objective: less topology information may make more sense. With the aid of an edge evaluator, each local graph is divided into a causal subgraph and a biased subgraph. A dual-GNN architecture encodes the two subgraphs into corresponding representations, so that the causal representations are drawn closer to the training objective while the biased representations are pushed away from it. Further, the Hilbert-Schmidt Independence Criterion (HSIC) is employed to strengthen the separability of the two subgraphs. Extensive experiments on six real-world graph datasets demonstrate the superiority of the proposed FedATH over the compared approaches.
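To make the abstract's pipeline concrete, below is a minimal PyTorch sketch of the three ingredients it names: an edge evaluator that scores edges, a dual-GNN encoding of the causal and biased views, and an HSIC penalty that discourages dependence between the two representations. This is an illustrative assumption of how such a local training step could look, not the authors' released code; the names EdgeEvaluator, SimpleGNN, rbf_hsic, and the soft edge weighting are all hypothetical, and the HSIC shown is the standard biased estimator tr(KHLH)/(n-1)^2 with RBF kernels.

```python
import torch
import torch.nn as nn

def rbf_hsic(x, y, sigma=1.0):
    """Biased HSIC estimator tr(K H L H) / (n-1)^2 with RBF kernels.
    Larger values indicate stronger statistical dependence between x and y."""
    n = x.size(0)
    K = torch.exp(-torch.cdist(x, x).pow(2) / (2 * sigma ** 2))
    L = torch.exp(-torch.cdist(y, y).pow(2) / (2 * sigma ** 2))
    H = torch.eye(n, device=x.device) - torch.ones(n, n, device=x.device) / n
    return torch.trace(K @ H @ L @ H) / (n - 1) ** 2

class EdgeEvaluator(nn.Module):
    """Scores each edge from its endpoint features; high scores read as 'causal'."""
    def __init__(self, dim):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, x, edge_index):
        src, dst = edge_index
        return torch.sigmoid(self.mlp(torch.cat([x[src], x[dst]], dim=-1))).squeeze(-1)

class SimpleGNN(nn.Module):
    """One round of weighted mean-neighbourhood message passing over an edge list."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, edge_index, edge_weight):
        src, dst = edge_index
        agg = torch.zeros_like(x)
        agg.index_add_(0, dst, x[src] * edge_weight.unsqueeze(-1))
        deg = torch.zeros(x.size(0), device=x.device)
        deg.index_add_(0, dst, edge_weight)
        return torch.relu(self.lin(agg / deg.clamp(min=1e-6).unsqueeze(-1)))

# Toy local forward pass: weight edges by their score, encode both views,
# and penalise dependence between the causal and biased representations.
x = torch.randn(10, 16)                       # node features (toy data)
edge_index = torch.randint(0, 10, (2, 40))    # random edge list (toy data)
evaluator = EdgeEvaluator(16)
gnn_causal, gnn_biased = SimpleGNN(16, 16), SimpleGNN(16, 16)

score = evaluator(x, edge_index)              # per-edge "causal" probability
z_c = gnn_causal(x, edge_index, score)        # causal view: edges weighted by score
z_b = gnn_biased(x, edge_index, 1.0 - score)  # biased view: complementary weights
independence_penalty = rbf_hsic(z_c, z_b)     # added to the task loss during training
```

The sketch uses soft edge weights so the split stays differentiable end-to-end; a hard top-k partition into two disjoint subgraphs, as the abstract's wording suggests, would be an equally valid design with a straight-through or sampling-based relaxation.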
Lay Summary: Many organizations, such as banks and companies, hold their own network data, like user connections or transaction links. But because of privacy concerns, they can't simply share this data with others. Our research focuses on how multiple organizations can work together to train a smart system that learns from their network data without ever sharing the raw data itself. This is difficult because each organization's data is structured differently, with different kinds of connections or network shapes. Most current solutions try to smooth out these differences, but that often requires a lot of computing power and doesn't always work well. We propose a new method called FedATH. The key idea is that not every connection in a network is useful; some may actually be distracting. Based on this idea, we break each local network into two parts: one with meaningful connections and one with less helpful or biased ones, and treat them differently during training. Our system also includes a way to keep these two parts separate, which helps the model focus on what really matters. We tested our method on six real-world datasets, and it consistently outperformed other approaches, showing it is both more effective and more efficient.
Primary Area: Deep Learning->Graph Neural Networks
Keywords: Federated learning, Graph neural networks, Topology heterogeneity
Submission Number: 9517