G-Sparse: Compiler-Driven Acceleration for Generalized Sparse Computation for Graph Neural Networks on Modern GPUs

Published: 01 Jan 2023 · Last Modified: 13 May 2025 · PACT 2023 · License: CC BY-SA 4.0
Abstract: Graph Neural Network (GNN) learning over non-Euclidean graph data has recently drawn rapidly increasing interest across many domains. Generalized sparse computation is crucial for maximizing the performance of GNN learning, yet most recent GNN systems have focused primarily on optimizing coarse-grained parallelism over nodes, edges, and additional feature dimensions. Implementing generalized sparse computation efficiently is challenging: existing Domain-Specific Languages (DSLs) rarely support in-depth architecture-aware optimization of such computation, and it is hard for experts to tune by hand, requiring substantial trial and error. In this work, we propose G-Sparse, a new compiler framework that extends the popular Halide compiler to effectively accelerate generalized sparse computation for GNNs through compiler-driven optimizations and auto-tuning. To facilitate generalized sparse computation, G-Sparse separates algorithms from schedules and introduces several novel sparse-computation optimization techniques for modern GPUs, including two-dimensional shared-memory optimizations and efficient cost-driven design-space exploration and auto-tuning. Extensive evaluation against highly optimized state-of-the-art sparse kernels and on end-to-end GNN training and inference shows that G-Sparse achieves up to a $4.75\times$ speedup over the state-of-the-art sparse kernels, and a training and inference speedup of $1.37\times\sim 2.25\times$ on three popular GNN models: GCN, GraphSAGE, and GAT. The source code of G-Sparse is publicly available at https://github.com/TuGraph-family/tugraph-db/tree/master/learn.
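The "generalized sparse computation" the abstract refers to centers on kernels such as sparse-dense matrix multiplication (SpMM), which implements neighbor aggregation in GNNs. The sketch below is not from the paper; it is a minimal reference implementation in plain Python, assuming a CSR-format adjacency matrix, intended only to show the loop structure (over nodes, incident edges, and feature dimensions) that G-Sparse-style compilers parallelize and tune on GPUs.

```python
def spmm_csr(indptr, indices, vals, dense):
    """Reference SpMM over a CSR sparse matrix: out = sparse @ dense.

    In GNN aggregation, row i of the output combines the feature rows
    of node i's neighbors, weighted by the corresponding edge values.
    indptr  : row pointers, length n_rows + 1
    indices : column index of each nonzero
    vals    : value of each nonzero
    dense   : n_cols x n_feat feature matrix (list of lists)
    """
    n_rows = len(indptr) - 1
    n_feat = len(dense[0])
    out = [[0.0] * n_feat for _ in range(n_rows)]
    for i in range(n_rows):                        # coarse parallelism: nodes
        for k in range(indptr[i], indptr[i + 1]):  # node i's incident edges
            j, v = indices[k], vals[k]
            for f in range(n_feat):                # parallelism: feature dim
                out[i][f] += v * dense[j][f]
    return out
```

On a GPU, the node and feature loops map to thread blocks and threads, and the edge loop is where schedule choices such as the paper's two-dimensional shared-memory tiling apply.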
