Keywords: Graph Neural Networks, Subgraph Federated Learning, Federated Learning
Abstract: Federated Learning (FL) on graphs enables collaborative model training that enhances performance without compromising the privacy of any client. However, previous methods often overlook the mutable nature of graph data, which frequently introduces new nodes and leads to shifts in label distribution. Unlike prior methods that struggle to generalize to unseen nodes with diverse label distributions, our proposed method, FedLoG, effectively addresses this issue by alleviating local overfitting. Our model generates global synthetic data by condensing the reliable information from each class representation and its structural information across clients. Using these synthetic data as a training set, we mitigate local overfitting by adaptively generalizing the knowledge absent from each local dataset.
This enhances the generalization capability of local models, enabling them to handle unseen data effectively. Our model outperforms the baselines in the proposed experimental settings, which are designed to measure generalization to unseen data in practical scenarios.
Our code is available at https://github.com/sung-won-kim/FedLoG
Submission Number: 7