Keywords: Graph Transformers, Node classification, Position Encoding, Graph Laplacian, Heterophily
TL;DR: We present a principled approach for selecting Laplacian eigenvectors to boost the performance of standard graph transformers on node classification.
Abstract: Graph transformers have emerged as powerful tools for modeling complex graph-structured data, offering the ability to capture long-range dependencies beyond the graph adjacency. Yet their performance on node classification often lags behind that of message passing and spectral graph networks. Unlike these methods, graph transformers require explicit positional encodings to inject structural information, which are most commonly derived from the eigenvectors of the graph Laplacian. Existing methods select eigenvectors using data-agnostic heuristics, assuming one-size-fits-all rules suffice. In contrast, we show that the spectral distribution of class information is graph-specific. To address this, we introduce *Broaden the Spectrum* (BTS), a novel, intuitive, and data-driven algorithm for selecting subsets of Laplacian eigenvectors for node classification. Our method is grounded in theory: we characterize the structure of optimal attention matrices for classification and show, in a simplified setting, how BTS naturally emerges as the eigenvector selection rule for achieving such attention matrices. When evaluated with standard graph transformer architectures, it delivers substantial performance gains across a wide range of node classification benchmarks. Our work shows that the performance of graph transformers on node classification has been held back by the choice of positional encodings and can be improved by employing a broader, well-chosen set of Laplacian eigenvectors.
Supplementary Material: zip
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 22146
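For concreteness, below is a minimal NumPy sketch of the standard Laplacian positional-encoding pipeline the abstract refers to. It implements only the conventional data-agnostic, smallest-k heuristic that the abstract argues against; the paper's BTS selection rule is not detailed here, so this is an illustrative baseline rather than the proposed method, and the function name and toy graph are assumptions for the example.

```python
import numpy as np

def laplacian_positional_encoding(adj: np.ndarray, k: int) -> np.ndarray:
    """Return an (n, k) matrix of Laplacian eigenvector positional encodings."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5
    # Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}
    lap = np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    _, eigvecs = np.linalg.eigh(lap)  # eigenvalues (and columns) in ascending order
    # Conventional heuristic the abstract critiques: keep the k lowest-frequency
    # eigenvectors, skipping the trivial (near-)constant one.
    return eigvecs[:, 1:k + 1]

# Toy usage: 4-node path graph; the resulting (4, 2) matrix would typically be
# concatenated to (or added into) node features before a standard graph transformer.
adj = np.array([[0., 1., 0., 0.],
                [1., 0., 1., 0.],
                [0., 1., 0., 1.],
                [0., 0., 1., 0.]])
print(laplacian_positional_encoding(adj, k=2).shape)  # (4, 2)
```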