Balancing Efficiency and Expressiveness: Subgraph GNNs with Walk-Based Centrality

Published: 01 May 2025, Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: We introduce a framework that drastically reduces the computational cost of Subgraph GNNs by leveraging walk-based centralities, both as an effective subgraph sampling strategy and as a powerful form of Structural Encoding.
Abstract: Subgraph GNNs have emerged as promising architectures that overcome the expressiveness limitations of Graph Neural Networks (GNNs) by processing bags of subgraphs. Despite their compelling empirical performance, these methods suffer from high computational complexity: they process bags whose size grows linearly with the number of nodes, hindering their applicability to larger graphs. In this work, we propose an effective and easy-to-implement approach that dramatically reduces the computational cost of Subgraph GNNs and broadens their applicability. Our method, dubbed HyMN, leverages walk-based centrality measures to sample a small number of relevant subgraphs and drastically reduce the bag size. By drawing a connection to perturbation analysis, we highlight the strength of the proposed centrality-based subgraph sampling, and further prove that these walk-based centralities can additionally be used as Structural Encodings for improved discriminative power. A comprehensive set of experimental results demonstrates that HyMN provides an effective synthesis of expressiveness, efficiency, and downstream performance, unlocking the application of Subgraph GNNs to dramatically larger graphs. Not only does our method outperform more sophisticated subgraph sampling approaches, it is also competitive with, and sometimes better than, other state-of-the-art approaches at a fraction of their runtime.
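For intuition, the two uses of walk-based centrality described in the abstract can be sketched in a few lines of Python. The snippet below is an illustrative sketch using NetworkX's subgraph centrality (the diagonal of exp(A), a classic walk-based measure); the function names, the bag size T, and the scalar encoding are assumptions for exposition, not the released implementation (see the code link below).

```python
# Minimal sketch of HyMN's two ingredients, assuming subgraph centrality
# as the walk-based measure; names and T are illustrative, not the paper's code.
import networkx as nx
import numpy as np

def sample_marked_nodes(G: nx.Graph, T: int = 2) -> list:
    """Return the T nodes with the highest walk-based centrality.

    Each selected node would define one 'marked' subgraph, shrinking
    the bag from |V| subgraphs down to T.
    """
    centrality = nx.subgraph_centrality(G)  # counts closed walks, weighted by 1/k!
    return sorted(centrality, key=centrality.get, reverse=True)[:T]

def centrality_encoding(G: nx.Graph) -> np.ndarray:
    """Reuse the same walk-based centralities as a per-node structural encoding."""
    centrality = nx.subgraph_centrality(G)
    return np.array([[centrality[v]] for v in G.nodes()], dtype=np.float32)

G = nx.karate_club_graph()
print(sample_marked_nodes(G, T=2))    # indices of the two most central nodes
print(centrality_encoding(G).shape)   # (34, 1): one extra feature per node
```

In a full Subgraph GNN pipeline, one natural way to wire this in would be to mark each selected node (e.g., via an extra indicator feature) to produce the reduced bag of subgraphs, and to concatenate the centrality column to the input node features as the Structural Encoding.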
Lay Summary: Graph Neural Networks (GNNs) are powerful tools for analyzing data structured like a network, such as social networks, molecules, or transportation systems. However, standard GNNs often fall short of capturing the full complexity of these structures. A newer class of models, called Subgraph GNNs, improves on this by analyzing smaller, overlapping parts of the graph. The problem? These models can be computationally expensive, especially as graphs grow larger, because they process a separate subgraph for every node in the graph. This work introduces HyMN, a simple yet effective way to cut the computational cost. Instead of blindly analyzing many subgraphs, HyMN uses walk-based measures of how "important" or "central" each node is to focus only on the most relevant parts of the graph. These same centrality measures also turn out to help the model better understand the graph's structure. Experiments show that HyMN is both faster and often more accurate than more complex alternatives, making it possible to apply Subgraph GNNs to much larger datasets than before.
Link To Code: https://github.com/jks17/HyMN/
Primary Area: Deep Learning->Graph Neural Networks
Keywords: Graph Neural Networks, Subgraphs, Expressivity, Graph Centrality
Submission Number: 4570