Efficient Subgraph GNNs via Graph Products and Coarsening

Published: 23 Oct 2024, Last Modified: 24 Feb 2025 · NeurReps 2024 Poster · CC BY 4.0
Keywords: Subgraph GNNs, Equivariance, Symmetries
TL;DR: This paper presents a Subgraph GNN framework using graph coarsening for efficient subgraph selection and uncovers new permutation symmetries, leading to improved performance and flexibility over previous methods.
Abstract: Subgraph Graph Neural Networks (Subgraph GNNs) enhance message-passing GNNs by representing graphs as sets of subgraphs and achieve strong performance, but their computational complexity limits their applicability to larger graphs. Previous methods sample subsets of subgraphs, either randomly or through a learnable policy, but these strategies yield suboptimal subgraph selections or restrict the subset size, causing performance drops. This paper presents a new framework that overcomes these challenges. We use a graph coarsening function to cluster nodes into super-nodes with induced connectivity. The product of the coarsened graph and the original graph reveals an implicit structure in which subgraphs are associated with specific sets of nodes. By applying generalized message-passing to this graph product, we obtain an efficient and powerful Subgraph GNN. Unlike previous methods, our approach permits flexible subgraph selection and is compatible with standard training pipelines. Additionally, we uncover new permutation symmetries in the resulting node feature tensor, which we exploit by designing linear equivariant layers for our Subgraph GNN architecture. Extensive experiments on several datasets show that our method is more flexible than previous approaches, effortlessly handling any number of subgraphs while consistently outperforming baselines.
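To make the construction in the abstract concrete, below is a minimal sketch (our illustration, not the authors' released code) of the pipeline it describes: coarsen a graph via a hard cluster assignment, form a product of the coarsened and original graphs, and run one message-passing step on the product. The Cartesian product rule, the hard assignments, the mean aggregation, and the names `coarsen`, `product_adjacency`, and `message_passing` are all assumptions made for illustration; the paper's generalized message-passing and equivariant layers are richer than this.

```python
# Minimal sketch (illustrative, not the authors' method): coarsen a graph,
# build a Cartesian-style product with the original graph, and run one
# mean-aggregation message-passing step over the product graph.
import numpy as np

def coarsen(A: np.ndarray, assign: np.ndarray) -> np.ndarray:
    """Induced super-node connectivity: super-nodes s and t are adjacent
    iff some edge of A links cluster s to cluster t.
    `assign` is a hard (n, k) cluster-assignment matrix (assumption)."""
    A_c = assign.T @ A @ assign
    np.fill_diagonal(A_c, 0)            # drop self-loops on super-nodes
    return (A_c > 0).astype(float)

def product_adjacency(A_c: np.ndarray, A: np.ndarray) -> np.ndarray:
    """Cartesian graph product (assumed product rule): product node (s, v)
    is adjacent to (t, u) iff (s ~ t and v == u) or (s == t and v ~ u)."""
    k, n = A_c.shape[0], A.shape[0]
    return np.kron(A_c, np.eye(n)) + np.kron(np.eye(k), A)

def message_passing(A_prod: np.ndarray, X: np.ndarray) -> np.ndarray:
    """One mean-aggregation message-passing step on the product graph."""
    deg = A_prod.sum(axis=1, keepdims=True).clip(min=1.0)
    return (A_prod @ X) / deg

# Toy example: a 4-cycle coarsened into 2 super-nodes of 2 nodes each.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
assign = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], dtype=float)

A_c = coarsen(A, assign)               # (2, 2) coarsened adjacency
A_prod = product_adjacency(A_c, A)     # (8, 8); product node index = s*n + v
X = np.tile(np.eye(4), (2, 1))         # one-hot node ids, copied per super-node
print(message_passing(A_prod, X).shape)  # (8, 4)
```

Under this reading, product node (s, v) plays the role of "node v inside the subgraph tied to super-node s", which mirrors the implicit subgraph structure the abstract attributes to the graph product; each row block of the feature tensor then corresponds to one subgraph.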
Submission Number: 47