What Improves the Generalization of Graph Transformers? A Theoretical Dive into the Self-attention and Positional Encoding

Published: 01 Jan 2024, Last Modified: 19 Apr 2025 · ICML 2024 · CC BY-SA 4.0