SimPlex-GT: A Simple Node-to-Cluster Graph Transformer for Synergizing Homophily and Heterophily in Complex Graphs
Keywords: Graph neural networks, Homophily and heterophily, Self-supervised learning
Abstract: Graph neural networks (GNNs) have proven effective on homophilic graphs, where connected nodes share similar features. However, real-world graphs often exhibit mixed patterns including heterophily, where connected nodes differ significantly. Traditional GNNs struggle with such cases due to their inherent smoothing operations. To address this limitation, we propose SimPlex-GT, a novel Graph Transformer (GT) that synergizes homophilic and heterophilic patterns by integrating local GNN message passing with a global node-to-cluster (N2C) attention mechanism. Our approach disentangles node representations into local and global components: local features model neighborhood similarity, while global features attend to dynamic cluster prototypes learned on the fly. A learnable gating mechanism fuses these complementary views, and an orthogonality constraint encourages representational diversity.
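The architectural ideas above can be illustrated with a minimal sketch. The code below is not the submission's implementation: the prototype count, the mean-aggregation stand-in for the local GNN, and all module and parameter names (`NodeToClusterAttention`, `GatedLocalGlobal`, `num_clusters`) are illustrative assumptions. It shows node-to-cluster attention over learnable prototypes, a per-node gate fusing the local and global views, and a prototype orthogonality penalty.

```python
# Hedged sketch (not the authors' code) of node-to-cluster attention,
# gated local/global fusion, and a prototype orthogonality penalty.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NodeToClusterAttention(nn.Module):
    """Each node attends to a small set of learnable cluster prototypes."""

    def __init__(self, dim: int, num_clusters: int = 8):
        super().__init__()
        self.prototypes = nn.Parameter(torch.randn(num_clusters, dim))
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [num_nodes, dim]; attention runs over K clusters, not all nodes,
        # so the cost is O(N * K) rather than O(N^2).
        scores = self.q(x) @ self.k(self.prototypes).T / x.size(-1) ** 0.5
        attn = scores.softmax(dim=-1)                  # [N, K]
        return attn @ self.v(self.prototypes)          # [N, dim]

    def orthogonality_penalty(self) -> torch.Tensor:
        # Encourage distinct prototypes: ||P P^T - I||_F^2 on normalized rows.
        p = F.normalize(self.prototypes, dim=-1)
        gram = p @ p.T
        eye = torch.eye(p.size(0), device=p.device)
        return ((gram - eye) ** 2).sum()


class GatedLocalGlobal(nn.Module):
    """Fuse a local message-passing view with the global cluster view."""

    def __init__(self, dim: int, num_clusters: int = 8):
        super().__init__()
        self.local_proj = nn.Linear(dim, dim)
        self.global_attn = NodeToClusterAttention(dim, num_clusters)
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Local view: mean aggregation over neighbors (a stand-in GNN layer).
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1)
        local = self.local_proj(adj @ x / deg)
        # Global view: attention over the learnable cluster prototypes.
        glob = self.global_attn(x)
        # A learnable gate mixes the two views per node and channel.
        g = torch.sigmoid(self.gate(torch.cat([local, glob], dim=-1)))
        return g * local + (1 - g) * glob


if __name__ == "__main__":
    n, d = 6, 16
    x = torch.randn(n, d)
    adj = (torch.rand(n, n) > 0.5).float()
    model = GatedLocalGlobal(d)
    out = model(x, adj)
    penalty = model.global_attn.orthogonality_penalty()
    print(out.shape, float(penalty))
```

Restricting attention to a fixed number of prototypes is what keeps the global view cheaper than full pairwise node attention; the gate then decides, per node, how much of each view to keep.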
SimPlex-GT is trained under a self-supervised teacher–student architecture in which the teacher sees the full graph and the student learns from masked inputs, with alignment enforced in a joint latent space. A dynamic masking strategy further emphasizes difficult nodes based on teacher–student prediction discrepancies. A theoretical analysis supports the model's expressive capability, and evaluations across 11 benchmark datasets show that SimPlex-GT achieves state-of-the-art performance on heterophilic graphs, remains highly competitive on homophilic graphs, and offers superior computational efficiency.
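The training scheme can be sketched in the same hedged spirit. The snippet below is an illustration, not the submission's training code: the cosine alignment loss, the exponential-moving-average teacher update, and the softmax re-weighting of the mask are plausible choices assumed here, and `ToyEncoder` merely stands in for SimPlex-GT.

```python
# Hedged sketch of teacher-student training with masked inputs, latent
# alignment, and discrepancy-driven ("dynamic") masking.
import torch
import torch.nn.functional as F


class ToyEncoder(torch.nn.Module):
    """Stand-in encoder (one mean-aggregation layer); SimPlex-GT would go here."""

    def __init__(self, dim_in: int, dim_out: int):
        super().__init__()
        self.lin = torch.nn.Linear(dim_in, dim_out)

    def forward(self, x, adj):
        deg = adj.sum(-1, keepdim=True).clamp(min=1)
        return self.lin(adj @ x / deg)


def train_step(student, teacher, x, adj, mask_rate=0.3, ema=0.99):
    # Teacher encodes the full, unmasked graph (no gradients flow through it).
    with torch.no_grad():
        z_teacher = teacher(x, adj)

    # Student sees masked inputs: a random subset of node features is zeroed.
    masked = torch.rand(x.size(0), device=x.device) < mask_rate
    if not masked.any():
        masked[0] = True
    x_masked = x.clone()
    x_masked[masked] = 0.0
    z_student = student(x_masked, adj)

    # Alignment in the joint latent space: cosine distance on the masked nodes.
    loss = (1 - F.cosine_similarity(z_student[masked], z_teacher[masked], dim=-1)).mean()
    loss.backward()  # the caller applies the optimizer step

    with torch.no_grad():
        # Per-node discrepancy can seed the next round's mask, putting more
        # weight on nodes the student currently predicts poorly.
        discrepancy = 1 - F.cosine_similarity(student(x, adj), z_teacher, dim=-1)
        next_mask_probs = torch.softmax(discrepancy, dim=0)

        # Teacher weights track the student via an exponential moving average
        # (an assumption; the submission may update the teacher differently).
        for p_t, p_s in zip(teacher.parameters(), student.parameters()):
            p_t.mul_(ema).add_((1 - ema) * p_s)

    return loss.item(), next_mask_probs


if __name__ == "__main__":
    x, adj = torch.randn(8, 16), (torch.rand(8, 8) > 0.5).float()
    student, teacher = ToyEncoder(16, 32), ToyEncoder(16, 32)
    teacher.load_state_dict(student.state_dict())
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    loss, mask_probs = train_step(student, teacher, x, adj)
    opt.step(); opt.zero_grad()
    print(loss, mask_probs.shape)
```

The returned `next_mask_probs` is one simple way to realize the dynamic masking idea: nodes with larger teacher-student discrepancy become more likely to be masked in the following step.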
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 19229