Keywords: Graph Transformers, Node classification, Position Encoding, Graph Laplacian, Heterophily
TL;DR: We show that graph transformers that use only low-frequency Laplacian eigenvectors underutilize the graph spectrum, and that incorporating higher-frequency eigenvectors into positional encodings can lead to substantial improvements on node classification tasks.
Abstract: Graph transformers have emerged as powerful tools for modeling complex graph-structured data, offering the ability to capture long-range dependencies. They require explicit positional encodings to inject structural information, which are most commonly derived from the eigenvectors of the graph Laplacian. However, existing approaches utilize only a small set of low-frequency eigenvectors, assuming that smooth global structure is sufficiently informative. We show that, for the task of node classification, it is possible to exploit a much broader spectrum of eigenvectors and achieve significant gains, especially in heterophilic graphs. Additionally, we introduce a first-principles approach for ranking and selecting eigenvectors based on their importance for node classification. Our method is plug-and-play and delivers substantial improvements across diverse benchmarks, elevating even vanilla graph transformers to match or surpass state-of-the-art models.
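The abstract does not spell out the construction, but the following is a minimal sketch of Laplacian-eigenvector positional encodings that draw on the full spectrum rather than only the lowest frequencies. The function name `laplacian_pe` and the evenly-spread selection of eigenvector indices are illustrative assumptions, not the paper's first-principles ranking criterion.

```python
# Minimal sketch (not the authors' method): Laplacian-eigenvector positional
# encodings that can draw from the whole spectrum instead of only the lowest
# frequencies. The "full" selection rule below is a simple stand-in for the
# paper's importance-based ranking, which the abstract does not specify.
import numpy as np

def laplacian_pe(adj: np.ndarray, k: int, freqs: str = "low") -> np.ndarray:
    """Return k eigenvector-based positional encodings per node.

    adj   : (n, n) symmetric adjacency matrix.
    k     : number of eigenvectors to use.
    freqs : "low"  -> k lowest-frequency eigenvectors (common practice),
            "full" -> k eigenvectors spread across the whole spectrum.
    """
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    # Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}
    lap = np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    eigvals, eigvecs = np.linalg.eigh(lap)  # eigenvalues sorted ascending
    if freqs == "low":
        idx = np.arange(k)                                      # smooth, global structure
    else:
        idx = np.linspace(0, len(eigvals) - 1, k).astype(int)   # sample the whole spectrum
    return eigvecs[:, idx]

# Toy usage: 4-node path graph; the encodings would be concatenated to node
# features before feeding them to the transformer.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
pe = laplacian_pe(A, k=2, freqs="full")
print(pe.shape)  # (4, 2)
```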
Submission Number: 110