Keywords: Signed Link Prediction, Graph Transformer, Contrastive Learning
Abstract: Signed link prediction in bipartite graphs is a fundamental task with wide-ranging applications, yet it poses significant challenges. Current Graph Neural Networks are inherently local due to their message-passing nature, preventing them from capturing the long-range dependencies crucial for accurate prediction. Furthermore, they often fail to model complex real-world data distributions characterized by severe class imbalance and rich intra-class multi-modality. To overcome these limitations, we propose the Hierarchical Prototypical Contrastive Sign-aware Graph Transformer (HPC-SGT). At its core, our framework features a Sign-aware Graph Transformer that operates on the line graph dual, leveraging novel spectral and motif-based inductive priors to learn structurally aware global representations. This expressive encoder is optimized via a hierarchical prototypical objective that learns a geometrically structured embedding space: it couples a class-balanced contrastive loss, which robustly handles data imbalance, with clustering and separation regularizers that explicitly model multi-modal class structures. The framework is unified by a cross-view consistency mechanism that grounds the learned semantic representations in the graph's foundational topology, bridging the structure-semantics gap. Extensive experiments on challenging benchmarks show that HPC-SGT significantly outperforms a wide range of state-of-the-art methods. Ablation studies further validate the contribution of each component, establishing HPC-SGT as a powerful and principled solution for signed link prediction. Our code is available in the supplementary materials.
Supplementary Material: zip
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 6722