Bayesian Neighborhood Adaptation for Graph Neural Networks

Published: 08 Jul 2025, Last Modified: 08 Jul 2025. Accepted by TMLR. License: CC BY 4.0.
Abstract: The neighborhood scope (i.e., the number of hops) over which graph neural networks (GNNs) aggregate information to characterize a node's statistical properties is critical to GNN performance. Two-stage approaches, which train and validate GNNs for every pre-specified neighborhood scope to search for the best setting, are time-consuming and tend to be biased by the design of the search space. How to adaptively determine proper neighborhood scopes for the aggregation process on both homophilic and heterophilic graphs remains largely unexplored. We therefore propose to model a GNN's message-passing behavior on a graph as a stochastic process, treating the number of hops as a beta process. This Bayesian framework allows us to infer the most plausible neighborhood scope for message aggregation jointly with the optimization of the GNN parameters. Our theoretical analysis shows that scope inference improves the expressivity of a GNN. Experiments on benchmark homophilic and heterophilic datasets show that the proposed method is compatible with state-of-the-art GNN variants, achieves competitive or superior performance on node classification, and provides well-calibrated predictions.
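The abstract describes gating the number of aggregation hops with a beta process so that the scope is inferred rather than fixed. The paper's actual formulation is not given on this page; the following is a minimal illustrative sketch of the general idea, assuming each hop's inclusion is a Bernoulli variable whose probability is drawn from a Beta prior and averaged over Monte Carlo samples. The function name `beta_process_aggregate` and the parameters `alpha`, `beta`, `max_hops`, and `S` are hypothetical placeholders, not the authors' API.

```python
import numpy as np

rng = np.random.default_rng(0)

def beta_process_aggregate(A, X, max_hops=3, alpha=1.0, beta=1.0, S=10):
    """Illustrative sketch (not the paper's method): average multi-hop
    aggregations, gating each additional hop with a Bernoulli indicator
    whose probability is drawn from Beta(alpha, beta).

    A: (n, n) row-normalized adjacency matrix.
    X: (n, d) node feature matrix.
    S: number of Monte Carlo samples over hop-inclusion indicators.
    """
    out = np.zeros_like(X)
    for _ in range(S):
        H = X.copy()          # features propagated so far
        agg = X.copy()        # running aggregation, always includes hop 0
        for _ in range(max_hops):
            H = A @ H                   # propagate one more hop
            pi = rng.beta(alpha, beta)  # hop-inclusion probability
            z = rng.binomial(1, pi)     # include this hop or not
            agg = agg + z * H
        out += agg
    return out / S
```

In a trained model the Beta parameters would be learned (e.g., via variational inference) jointly with the GNN weights, so the expected scope adapts to the graph instead of being chosen by a two-stage search.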
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: The following major changes have been made since the original submission:
- **Literature Review:**
  - Added more related works in Section 2.1, with a detailed discussion of Bayesian methods in graph learning and their advantages.
- **Methodology:**
  - Clarified the motivation for using Bayesian inference and the rationale for leveraging the beta process to model neighborhood scope in Sections 3 and 3.1.
  - Established connections between Subsections 3.1, 3.2, and 3.3, improving the flow.
- **Experiments:**
  - Provided an overview of the experiments and their objectives in Section 5.
  - Added more discussion of computational complexity and justification for the increased complexity.
- **Appendix:**
  - Added new sections: Appendices L, M, and N, analyzing the effect of increasing the number of samples $S$, the effect of the prior parameters $\alpha$ and $\beta$, and an additional experiment on link prediction, respectively.
  - Referenced the appendix sections in the main text wherever applicable for better readability.
Supplementary Material: zip
Assigned Action Editor: ~Peilin_Zhao2
Submission Number: 4239