GrokFormer: Graph Fourier Kolmogorov-Arnold Transformers

Published: 01 May 2025, Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: GrokFormer is a novel Graph Transformer that learns expressive, adaptive spectral filters through order-K Fourier series modeling, overcoming the limitations of self-attention and improving graph representation learning performance.
Abstract: Graph Transformers (GTs) have demonstrated remarkable performance in graph representation learning over popular graph neural networks (GNNs). However, self-attention, the core module of GTs, preserves only low-frequency signals in graph features, making it ineffective at capturing other important signals such as high-frequency ones. Some recent GT models help alleviate this issue, but their flexibility and expressiveness are still limited, since the filters they learn are fixed to a predefined graph spectrum or spectral order. To tackle this challenge, we propose the Graph Fourier Kolmogorov-Arnold Transformer (GrokFormer), a novel GT model that learns highly expressive spectral filters with an adaptive graph spectrum and spectral order through Fourier series modeling over learnable activation functions. We demonstrate theoretically and empirically that the proposed GrokFormer filter offers better expressiveness than other spectral methods. Comprehensive experiments on 10 real-world node classification datasets across various domains, scales, and graph properties, as well as 5 graph classification datasets, show that GrokFormer outperforms state-of-the-art GTs and GNNs. Our code is available at https://github.com/GGA23/GrokFormer.
Lay Summary: Graph Transformers (GTs) have shown remarkable performance in graph representation learning. However, their core component, self-attention, primarily retains low-frequency signals, making it difficult to model graphs with diverse properties. This paper presents GrokFormer, a novel GT model that learns expressive and adaptive spectral filters through order‑K Fourier series modeling, overcoming the limitations of self-attention by effectively capturing rich frequency signals across a broad range of graph spectra and orders. Comprehensive experiments on both synthetic and real-world datasets demonstrate the superiority of GrokFormer over state-of-the-art GNNs and GTs.
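The filtering idea described above, a truncated Fourier series evaluated on powers of the graph Laplacian spectrum, can be illustrated with a minimal NumPy sketch. Note that the function names, the coefficient parameterization, and the fixed coefficients below are illustrative assumptions for exposition only; the paper learns these quantities end-to-end with KAN-style learnable activation functions, which this sketch does not reproduce.

```python
import numpy as np

def fourier_spectral_filter(eigvals, coeffs, K=2, M=3):
    """Evaluate a truncated Fourier-series filter h(lambda) on Laplacian
    eigenvalues. For each spectral order k = 1..K, lambda**k is passed
    through a Fourier series with M harmonics:
        h(lambda) = sum_k sum_m a[k,m] cos(m*pi*lambda^k)
                              + b[k,m] sin(m*pi*lambda^k)
    coeffs: dict with 'a' and 'b' arrays of shape (K, M). In the paper
    these would be learned; here they are fixed for illustration.
    """
    a, b = coeffs["a"], coeffs["b"]
    h = np.zeros_like(eigvals)
    for k in range(1, K + 1):
        lam_k = eigvals ** k                # order-k spectrum
        for m in range(1, M + 1):
            h += a[k - 1, m - 1] * np.cos(m * np.pi * lam_k)
            h += b[k - 1, m - 1] * np.sin(m * np.pi * lam_k)
    return h

def apply_filter(adj, X, coeffs, K=2, M=3):
    """Filter node features X in the spectral domain of the symmetric
    normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
    L = np.eye(len(adj)) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    eigvals, U = np.linalg.eigh(L)          # spectrum lies in [0, 2]
    h = fourier_spectral_filter(eigvals, coeffs, K, M)
    return U @ (h[:, None] * (U.T @ X))     # U diag(h(Lambda)) U^T X
```

Because sines and cosines of the (powered) eigenvalues form a rich basis on [0, 2], such a series can approximate low-pass, high-pass, or band-pass responses depending on the coefficients, which is the flexibility the abstract contrasts with filters fixed to a predefined spectrum or order.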
Link To Code: https://github.com/GGA23/GrokFormer
Primary Area: Deep Learning->Graph Neural Networks
Keywords: Graph Transformers, Kolmogorov-Arnold Networks, Fourier Series, Spectral Graph Neural Networks
Submission Number: 4830