Effect of Geometry on Graph Neural Networks

TMLR Paper 5016 Authors

02 Jun 2025 (modified: 05 Jun 2025) · Under review for TMLR · CC BY 4.0
Abstract: Hyperbolic Graph Neural Networks (GNNs) have emerged as a promising approach for modeling graph-structured data with less embedding distortion than their Euclidean counterparts. In this paper, we study the effect of geometry on the performance of three types of GNNs for node classification and link prediction. To do so, we build on the hyperbolic framework of Chen et al. (2022) and propose a family of GNNs with alternating geometry, integrating hyperbolic and Euclidean components that are trained jointly. We compare the performance and stability of our alternating-geometry models against their Euclidean and hyperbolic counterparts across several datasets. Finally, we examine how the choice of geometry and graph properties affects hyperparameter selection. The alternating-geometry models achieved the best performance in node classification, while the hyperbolic models outperformed both the alternating and Euclidean models in link prediction. Moreover, for node classification, the choice of architecture had a greater impact on performance than the choice of geometry, whereas for link prediction, geometry mattered more than architecture.
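The abstract does not specify how the alternating-geometry layers are constructed. As a rough illustration only, the sketch below shows one plausible reading, assuming the alternation interleaves a Euclidean message-passing block with a hyperbolic block that operates through exponential/logarithmic maps at the origin of the Poincaré ball (a common device in hyperbolic GNNs such as HGCN). All class and function names here are hypothetical and are not the authors' implementation.

```python
# A minimal sketch of an "alternating geometry" GNN, assuming the alternation
# is Euclidean message passing followed by a hyperbolic layer that works via
# exp/log maps at the origin of the Poincare ball (curvature c > 0).
# Hypothetical names throughout; not the paper's code.
import torch
import torch.nn as nn


def expmap0(v: torch.Tensor, c: float = 1.0, eps: float = 1e-6) -> torch.Tensor:
    """Exponential map at the origin: tangent space -> Poincare ball."""
    sqrt_c = c ** 0.5
    norm = v.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.tanh(sqrt_c * norm) * v / (sqrt_c * norm)


def logmap0(x: torch.Tensor, c: float = 1.0, eps: float = 1e-6) -> torch.Tensor:
    """Logarithmic map at the origin: Poincare ball -> tangent space."""
    sqrt_c = c ** 0.5
    norm = x.norm(dim=-1, keepdim=True).clamp(eps, (1.0 - eps) / sqrt_c)
    return torch.atanh(sqrt_c * norm) * x / (sqrt_c * norm)


class EuclideanLayer(nn.Module):
    """Plain GCN-style layer: linear transform, then neighbor averaging."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        return torch.relu(adj @ self.lin(x))


class HyperbolicLayer(nn.Module):
    """HGCN-style layer: transform and aggregate in the tangent space at the
    origin, where Euclidean operations are valid, then map back to the ball."""

    def __init__(self, dim: int, c: float = 1.0):
        super().__init__()
        self.lin = nn.Linear(dim, dim)
        self.c = c

    def forward(self, x_ball: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        v = logmap0(x_ball, self.c)            # ball -> tangent space
        v = adj @ self.lin(v)                  # transform + aggregate
        return expmap0(torch.relu(v), self.c)  # tangent space -> ball


class AlternatingGNN(nn.Module):
    """Stack one Euclidean and one hyperbolic block; both blocks are trained
    jointly with ordinary backpropagation."""

    def __init__(self, in_dim: int, hid_dim: int, c: float = 1.0):
        super().__init__()
        self.euc = EuclideanLayer(in_dim, hid_dim)
        self.hyp = HyperbolicLayer(hid_dim, c)
        self.c = c

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        h = self.euc(x, adj)    # Euclidean geometry
        h = expmap0(h, self.c)  # switch geometry: lift onto the ball
        h = self.hyp(h, adj)    # hyperbolic geometry
        return logmap0(h, self.c)  # Euclidean readout for a downstream head


# Toy usage: 4 nodes, row-normalized adjacency with self-loops.
a = torch.eye(4) + (torch.rand(4, 4) > 0.5).float()
adj = a / a.sum(dim=1, keepdim=True)
x = torch.randn(4, 8)
print(AlternatingGNN(8, 16)(x, adj).shape)  # torch.Size([4, 16])
```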
Submission Length: Long submission (more than 12 pages of main content)
Assigned Action Editor: ~Christopher_Morris1
Submission Number: 5016