SHAKE-GNN: Scalable Hierarchical Kirchhoff-Forest Graph Neural Network

ICLR 2026 Conference Submission 17227 Authors

19 Sept 2025 (modified: 08 Oct 2025), ICLR 2026 Conference Submission, CC BY 4.0
Keywords: Graph Neural Networks, Kirchhoff Forests, Hierarchical Graph Coarsening, Scalability, Graph Classification
TL;DR: We propose SHAKE-GNN, a scalable graph neural network that uses Kirchhoff Forest–based multi-resolution coarsening to balance efficiency and accuracy in graph classification.
Abstract: Graph Neural Networks (GNNs) have achieved remarkable success across a range of learning tasks. However, scaling GNNs to large graphs remains a significant challenge, especially for graph-level tasks. In this work, we introduce SHAKE-GNN, a novel scalable graph-level GNN framework built on a hierarchy of Kirchhoff Forests, a class of random spanning forests used to construct stochastic multi-resolution decompositions of graphs. SHAKE-GNN produces multi-scale representations, enabling flexible trade-offs between computational efficiency and predictive performance. We also present an improved, data-driven strategy for selecting the trade-off parameter and analyse the time complexity of SHAKE-GNN. Experimental results on multiple large-scale graph classification benchmarks demonstrate that SHAKE-GNN achieves competitive performance while offering improved scalability.
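To make the coarsening idea concrete, the sketch below illustrates one level of Kirchhoff-Forest coarsening as it is usually described in the literature: a random spanning forest is sampled with Wilson-style loop-erased random walks using a killing rate q, and each resulting tree is contracted to its root to form a coarser graph. This is only a minimal illustration under those assumptions, not the authors' implementation; the function names (sample_kirchhoff_forest, roots_of, coarsen), the adjacency-list graph representation, and the choice of q are illustrative and not taken from the paper.

```python
import random


def sample_kirchhoff_forest(adj, q, rng=random):
    """Sample a random spanning forest of an unweighted graph.

    Wilson-style loop-erased random walks with killing rate q: at node v the
    walk is absorbed (v becomes a root) with probability q / (q + deg(v)),
    otherwise it moves to a uniformly random neighbour. Larger q yields more
    roots (a finer partition); smaller q yields fewer, larger trees.

    adj: dict mapping each node to a list of its neighbours.
    Returns: dict mapping each node to its successor in the forest (None for roots).
    """
    in_forest = set()
    next_hop = {}

    for start in adj:
        # Phase 1: random walk from `start`; overwriting next_hop on revisits
        # implements loop erasure.
        v = start
        while v not in in_forest:
            deg = len(adj[v])
            if deg == 0 or rng.random() < q / (q + deg):
                next_hop[v] = None      # absorbed: v becomes a new root
                in_forest.add(v)
                break
            next_hop[v] = rng.choice(adj[v])
            v = next_hop[v]
        # Phase 2: add the loop-erased path from `start` to the forest.
        v = start
        while v not in in_forest:
            in_forest.add(v)
            v = next_hop[v]
    return next_hop


def roots_of(next_hop):
    """Map each node to the root of its tree by following successor pointers."""
    root = {}
    for v in next_hop:
        path = []
        while v not in root and next_hop[v] is not None:
            path.append(v)
            v = next_hop[v]
        r = root.get(v, v)
        root[r] = r
        for u in path:
            root[u] = r
    return root


def coarsen(adj, root):
    """Contract each tree to its root; connect roots of adjacent trees."""
    coarse = {r: set() for r in set(root.values())}
    for v, nbrs in adj.items():
        for u in nbrs:
            if root[u] != root[v]:
                coarse[root[v]].add(root[u])
    return coarse


if __name__ == "__main__":
    # Toy example: a 6-cycle; q controls how aggressively it is coarsened.
    adj = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
    forest = sample_kirchhoff_forest(adj, q=0.5)
    root = roots_of(forest)
    print("node -> root:", root)
    print("coarse graph:", coarsen(adj, root))
```

Repeating this sample-and-contract step on successive coarse graphs (e.g. with a schedule of q values) would produce the kind of stochastic multi-resolution hierarchy the abstract refers to, with q acting as the efficiency/performance trade-off knob.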
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 17227