AH-UGC: $\underline{\text{A}}$daptive and $\underline{\text{H}}$eterogeneous-$\underline{\text{U}}$niversal $\underline{\text{G}}$raph $\underline{\text{C}}$oarsening

ICLR 2026 Conference Submission 13359 Authors

18 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Graph Coarsening, Graph Neural Networks, Scaling Graph Learning, Hashing
TL;DR: We propose a fast, hashing-based framework for adaptive and heterogeneous graph coarsening that supports multiple coarsening ratios and preserves semantic structure.
Abstract: $\textbf{Graph Coarsening (GC)}$ is a prominent graph reduction technique that compresses large graphs to enable efficient graph learning. However, existing GC methods generate only one coarsened graph per run and must recompute from scratch for each new coarsening ratio, incurring unnecessary overhead. Moreover, most prior approaches are tailored to $\textit{homogeneous}$ graphs and fail to accommodate the semantic constraints of $\textit{heterogeneous}$ graphs, which comprise multiple node and edge types. To overcome these limitations, we introduce a novel framework that combines Locality-Sensitive Hashing (LSH) with Consistent Hashing (CH) to enable $\textit{adaptive graph coarsening}$. Because it is built on hashing, our method is inherently fast and scalable. For heterogeneous graphs, we propose a $\textit{type-isolated coarsening}$ strategy that ensures semantic consistency by restricting merges to nodes of the same type. Our approach is the first unified framework to support both adaptive and heterogeneous coarsening. Extensive evaluations on 23 real-world datasets, including homophilic, heterophilic, homogeneous, and heterogeneous graphs, demonstrate that our method achieves superior scalability while preserving the structural and semantic integrity of the original graph.
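To make the abstract's two ideas concrete, here is a minimal Python sketch of hashing-based, type-isolated coarsening. It is not the authors' implementation: the function names (`lsh_bits`, `assign_supernodes`) are hypothetical, and it substitutes nested LSH-signature prefixes for the paper's consistent-hashing component as a simplification, illustrating only how one hashing pass can serve multiple coarsening ratios while merges stay within a node type.

```python
import numpy as np

def lsh_bits(X, num_planes=16, seed=0):
    """SimHash-style LSH: sign pattern of random hyperplane projections.
    Nodes with similar features agree on most bits, so they share buckets."""
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((X.shape[1], num_planes))
    return (X @ planes) > 0  # boolean array of shape (n_nodes, num_planes)

def assign_supernodes(bits, node_types, prefix_len):
    """Type-isolated coarsening at one ratio: bucket nodes by
    (node type, first `prefix_len` LSH bits). A shorter prefix merges more
    aggressively, so many ratios reuse the same precomputed `bits`."""
    buckets = {}
    assign = np.empty(len(bits), dtype=int)
    for i, (t, row) in enumerate(zip(node_types, bits[:, :prefix_len])):
        key = (t, row.tobytes())       # same-type nodes only can share a key
        assign[i] = buckets.setdefault(key, len(buckets))
    return assign                      # assign[i] = supernode id of node i

# One hashing pass, two coarsening ratios (signatures are not recomputed):
X = np.random.randn(1000, 32)               # toy node features
types = np.random.randint(0, 3, size=1000)  # toy node-type labels
bits = lsh_bits(X)
coarse = assign_supernodes(bits, types, prefix_len=4)   # fewer supernodes
fine   = assign_supernodes(bits, types, prefix_len=10)  # more supernodes
```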
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 13359