Federated Spectral Graph Transformers Meet Neural Ordinary Differential Equations for Non-IID Graphs
Abstract: Graph Neural Network (GNN) research is advancing rapidly, driven by GNNs’ capacity to learn distributed representations from graph-structured data. However, centralizing large volumes of real-world graph data for GNN training is often impractical due to privacy concerns, regulatory restrictions, and commercial competition. Federated learning (FL), a distributed learning paradigm, addresses this by enabling collaborative model training while preserving data privacy. Yet despite progress in federated training of large vision and language models, federated learning for GNNs remains underexplored. To address this gap, we present a novel federated learning method for GNNs based on spectral GNNs equipped with neural ordinary differential equations (ODEs) for better information capture, showing promising results on both homophilic and heterophilic graphs. Our approach effectively handles non-Independent and Identically Distributed (non-IID) data while matching the performance of existing methods that operate only on IID data. It is designed to be privacy-preserving and bandwidth-optimized, making it suitable for real-world applications such as social network analysis, recommendation systems, and fraud detection, which often involve complex, non-IID, and heterophilic graph structures. On non-IID heterophilic graphs, our method achieves significant improvements, and it also performs better on homophilic graphs. This work highlights the potential of federated learning in diverse and challenging graph settings.
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: We thank the reviewers for their constructive feedback. We have addressed all comments and implemented the following changes in the revised manuscript:
- *Alpha Notation Fix*: All instances of the string "alpha" were replaced with the proper LaTeX math notation $\alpha$ to align with the standard usage in federated learning literature.
- *Citation Format Corrections*: We carefully revised all citations to ensure correct usage of textual citations via \citet{} and parenthetical citations via \citep{}, depending on context.
- *Figure Revisions*: As suggested, we improved clarity in Figure 2 by refining annotations and visual grouping to avoid confusion.
- *Additional Experiments (IID Sensitivity)*: We added new experiments investigating the effect of increasing IID-ness (i.e., larger $\alpha$ values in Dirichlet partitioning). These results are included in Section 5 and Appendix B. As shown in the updated Table 3, although performance may decline with higher $\alpha$ in some settings, this is consistent with prior findings that greater client homogeneity can lead to overfitting to local structures.
- *Scaling Evaluation*: We included experimental results on larger datasets (Penn94, Arxiv-Year) to demonstrate the model’s scalability with increasing graph size. These results, discussed in Section 5.2, show that Fed-GNODEFormer maintains strong performance as the number of nodes and clients scale.
- *Hyperparameter Tuning for Neural ODE*: We provided details of key hyperparameters specific to the ODE component in Appendix D. We found that tuning the step size $\epsilon$, the number of function evaluations, and the depth of the residual history layer had notable effects on stability and convergence.
- *Expanded Related Work*: We significantly extended the Related Work section to address reviewer comments:
  - Discussed *A Survey of Graph Neural Networks in Real World: Imbalance, Noise, Privacy, and OOD Challenges* \citep{zhao2023survey} to contextualize our work under real-world graph learning constraints.
  - Added a discussion of ODE-GCN \citep{zhuang2020odegcn} and PDE-GCN \citep{eliasof2021pdegcn}, clarifying how our model extends continuous-time GNNs to federated and non-IID environments.
- *Video & Code*: A narrated video presentation was prepared and linked as part of the submission. Code has been open-sourced and is available in the following [repository](https://github.com/SpringWiz11/Fed-GNODEFormer).
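To make the Dirichlet partitioning used in the IID-sensitivity experiments concrete, below is a minimal, stdlib-only Python sketch of per-class Dirichlet($\alpha$) label partitioning (a Dirichlet draw is obtained by normalizing Gamma samples). The function name and details are illustrative, not taken from our released code; smaller $\alpha$ produces more skewed (non-IID) client splits.

```python
import random
from collections import defaultdict

def dirichlet_partition(labels, num_clients, alpha, seed=0):
    """Split sample indices across clients with per-class Dirichlet(alpha)
    proportions; smaller alpha -> more non-IID label distributions."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for idx, y in enumerate(labels):
        by_class[y].append(idx)
    clients = [[] for _ in range(num_clients)]
    for _, idxs in by_class.items():
        rng.shuffle(idxs)
        # Dirichlet sample via normalized Gamma draws (stdlib only)
        weights = [rng.gammavariate(alpha, 1.0) for _ in range(num_clients)]
        total = sum(weights)
        props = [w / total for w in weights]
        # assign contiguous slices of this class according to the proportions
        start = 0
        for c in range(num_clients):
            if c == num_clients - 1:
                end = len(idxs)  # last client takes the remainder
            else:
                end = min(start + round(props[c] * len(idxs)), len(idxs))
            clients[c].extend(idxs[start:end])
            start = end
    return clients
```

Every index is assigned to exactly one client, so evaluating the union of client index sets recovers the full dataset regardless of $\alpha$.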
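The step-size trade-off noted for the ODE component can be illustrated with a fixed-step explicit Euler solver: a smaller step size increases the number of function evaluations (NFE) but tracks the continuous dynamics more accurately. This is a generic sketch of that trade-off, not necessarily the solver configuration used in the paper.

```python
def euler_integrate(f, h0, t0, t1, step):
    """Fixed-step explicit Euler solve of dh/dt = f(t, h) from t0 to t1.
    Returns the final state and the number of function evaluations (NFE)."""
    h, t, nfe = h0, t0, 0
    while t < t1:
        dt = min(step, t1 - t)      # clamp the final step at t1
        h = h + dt * f(t, h)        # one explicit Euler update
        nfe += 1
        t += dt
    return h, nfe
```

For example, integrating $dh/dt = h$ from $h(0)=1$ to $t=1$ with step $10^{-3}$ takes roughly 1000 function evaluations and approximates $e \approx 2.718$; halving the step doubles the NFE while shrinking the error.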
Video: https://iiithydresearch-my.sharepoint.com/:v:/g/personal/himanshu_pal_research_iiit_ac_in/EauLgP-ap0NIpfOR3aUaFhcBSnnDCqp3FdJO_7bDk5chLg?nav=eyJyZWZlcnJhbEluZm8iOnsicmVmZXJyYWxBcHAiOiJPbmVEcml2ZUZvckJ1c2luZXNzIiwicmVmZXJyYWxBcHBQbGF0Zm9ybSI6IldlYiIsInJlZmVycmFsTW9kZSI6InZpZXciLCJyZWZlcnJhbFZpZXciOiJNeUZpbGVzTGlua0NvcHkifX0&e=KeuQWc
Code: https://github.com/SpringWiz11/Fed-GNODEFormer
Assigned Action Editor: ~Peilin_Zhao2
Submission Number: 3797