Efficient and Expressive Graph Neural Networks

Published: 23 Sept 2025, Last Modified: 28 Oct 2025 · NPGML Poster · CC BY 4.0
Keywords: Graph neural networks, expressiveness
Abstract: Graph neural networks (GNNs) have achieved remarkable success but remain limited in distinguishing non-isomorphic graphs with similar local structures, due to their reliance on neighborhood aggregation. Higher-order or subgraph-counting GNNs offer greater expressivity, but at prohibitive computational cost. We introduce \emph{Polynomial-time Cycle basis-GNNs} (PCB-GNNs), a topology-aware architecture that augments message passing with polynomial-time cycle-basis features, capturing essential global invariants overlooked by standard MPNNs. PCB-GNNs distinguish challenging graph families where 1-WL and typical GNNs fail, while incurring only polynomial overhead. Experiments across synthetic, molecular, and protein benchmarks show that PCB-GNNs consistently outperform state-of-the-art models on both expressiveness benchmarks and large-scale tasks. PCB-GNNs achieve classification accuracy on MUTAG (98.53\%), PTC (86.48\%), PROTEINS (82.21\%), and NCI1 (88.37\%) while scaling effectively to larger datasets such as IMDB-B. On the ZINC-Subset molecular regression task, our model attains an MAE of 0.054.
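To illustrate the kind of feature the abstract describes, the sketch below computes per-node cycle-length histograms over a minimum cycle basis (polynomial-time via NetworkX). This is an assumption-laden illustration, not the paper's actual feature construction: the function name, the length cutoff `max_len`, and the histogram encoding are all ours. It does, however, show the core point from the abstract: such features separate graph pairs (e.g., one 6-cycle vs. two triangles) that 1-WL and standard MPNNs cannot distinguish.

```python
import networkx as nx
import numpy as np

def cycle_basis_features(G, max_len=8):
    """Hypothetical cycle-basis node features (illustrative only).

    For each node, build a histogram counting how many cycles in a
    minimum cycle basis of G pass through it, bucketed by cycle length
    (lengths 3..max_len). Minimum cycle bases are computable in
    polynomial time, matching the overhead claimed in the abstract.
    """
    nodes = list(G.nodes())
    idx = {v: i for i, v in enumerate(nodes)}
    feats = np.zeros((len(nodes), max_len - 2))  # columns = lengths 3..max_len
    for cycle in nx.minimum_cycle_basis(G):
        length = len(cycle)
        if 3 <= length <= max_len:
            for v in cycle:
                feats[idx[v], length - 3] += 1
    return feats

# One 6-cycle vs. two disjoint triangles: both are 2-regular on 6 nodes,
# so 1-WL assigns identical colorings -- but their cycle bases differ.
G1 = nx.cycle_graph(6)
G2 = nx.disjoint_union(nx.cycle_graph(3), nx.cycle_graph(3))
h1 = cycle_basis_features(G1).sum(axis=0)  # mass at length 6
h2 = cycle_basis_features(G2).sum(axis=0)  # mass at length 3
```

Concatenating such features to initial node embeddings (or to a global readout, as with `h1`/`h2` above) is one plausible way to inject the global invariants the abstract refers to; the paper's exact mechanism may differ.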
Submission Number: 117