Keywords: Graph Learning, Temporal Graph, Transformer, Link Prediction, Dynamic Graph
TL;DR: A generative Transformer-based model for dynamic graph learning with structural-temporal context and efficient sequence prediction.
Abstract: The prevailing paradigms in dynamic graph learning, which rely on local neighbor propagation or node-specific memory updates, are fundamentally limited in their ability to model global interaction patterns and to scale inference efficiently. To address these challenges, we propose GET (Global Event Transformer), a generative framework that reformulates dynamic graph learning as global event sequence modeling. By processing the entire interaction history as a unified stream of events, GET captures dependencies beyond the local receptive fields of prior methods, while its "encode once, score many" design removes the candidate-wise inference bottleneck of discriminative models. GET flexibly incorporates structural priors through lightweight GNN encoders and memory modules, which provide local structural and temporal cues that the global Transformer then reasons over. Our main focus in this paper is dynamic link prediction under standard Temporal Graph Benchmark (TGB) protocols, complemented by preliminary short-horizon generative evaluations on small and medium graphs. Extensive experiments on five large-scale TGB benchmarks and six additional datasets show that GET achieves strongly competitive performance while delivering substantially higher inference throughput (up to 21.4× on tgbl-wiki) than strong baselines.
Supplementary Material: zip
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 7662