IVQ-GNN: Mitigating Performance Gap from Graph Connection Pattern Inconsistency via Vector Quantization

Published: 12 Apr 2026 · Last Modified: 08 May 2026 · OpenReview Archive Direct Upload · License: CC BY 4.0
Abstract: Heterophily in graphs is a key challenge for Graph Neural Networks (GNNs). By proposing various homophily measures, recent work has provided insights into how heterophily affects node classification. However, while both graph homophily and heterophily can be further refined into diverse connection patterns, previous work has largely overlooked the role of connection pattern inconsistency. In this paper, we delve deeper into heterophily and homophily by shifting from coarse-grained heterophily ratios to a unified, fine-grained formulation based on connection patterns, and we further reveal an uneven distribution of these patterns as well as a train–test gap between them. Empirical studies indicate that this inconsistency leads to a severe performance disparity. To address this issue, we propose a novel two-stage method named IVQ-GNN. In the pretraining phase, IVQ-GNN encodes diverse connection patterns into a codebook that serves as an orthogonal basis for the representation space. In the fine-tuning phase, a self-attention module linearly combines these orthogonal bases to expand the learned token space of connection patterns, thereby improving adaptation to rare and out-of-distribution (OOD) patterns. Experimental results on multiple datasets demonstrate that IVQ-GNN significantly improves model performance and validate that the proposed method effectively addresses connection pattern inconsistency. Our code is available at https://github.com/Duyx5149/IVQ-GNN.
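To make the two-stage design described in the abstract more concrete, below is a minimal PyTorch-style sketch of the general idea: a vector-quantization codebook learned in a pretraining stage, followed by a self-attention module that linearly combines codebook entries during fine-tuning. The module names (`VQCodebook`, `CodeAttention`), the straight-through estimator, and the commitment-loss weighting are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class VQCodebook(nn.Module):
    """Pretraining stage (sketch): quantize node embeddings against a learnable codebook."""

    def __init__(self, num_codes: int, dim: int):
        super().__init__()
        self.codebook = nn.Parameter(torch.randn(num_codes, dim))

    def forward(self, h):
        # h: [num_nodes, dim]; assign each node to its nearest codebook entry
        dists = torch.cdist(h, self.codebook)            # [num_nodes, num_codes]
        idx = dists.argmin(dim=-1)                        # nearest-code index per node
        q = self.codebook[idx]                            # quantized embeddings
        # straight-through estimator so gradients still reach the encoder
        q_st = h + (q - h).detach()
        # standard VQ objective: codebook loss + commitment loss (weight assumed)
        vq_loss = F.mse_loss(q, h.detach()) + 0.25 * F.mse_loss(h, q.detach())
        return q_st, idx, vq_loss


class CodeAttention(nn.Module):
    """Fine-tuning stage (sketch): attend over codebook entries to mix them linearly."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, h, codebook):
        # each node queries all codebook entries and receives a convex combination
        q = h.unsqueeze(1)                                # [num_nodes, 1, dim]
        kv = codebook.unsqueeze(0).expand(h.size(0), -1, -1)
        out, _ = self.attn(q, kv, kv)
        return out.squeeze(1)                             # expanded token representation
```

A hypothetical usage on top of any GNN encoder's node embeddings might look like the following; the sizes are placeholders:

```python
h = torch.randn(500, 128)                 # e.g. output of a GNN encoder
vq = VQCodebook(num_codes=64, dim=128)
mix = CodeAttention(dim=128)

q, idx, vq_loss = vq(h)                   # pretraining-style quantization
z = mix(q, vq.codebook)                   # fine-tuning-style combination of codes
```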