Uncovering Strong Lottery Tickets in Graph Transformers: A Path to Memory Efficient and Robust Graph Learning

Published: 24 Mar 2025, Last Modified: 24 Mar 2025. Accepted by TMLR. License: CC BY 4.0
Abstract: Graph Transformers (GTs) have recently demonstrated strong capabilities for capturing complex relationships in graph-structured data using global self-attention mechanisms. However, their high memory requirements during inference remain a challenge for practical deployment. In this study, we investigate the existence of strong lottery tickets (SLTs) — subnetworks within randomly initialized neural networks that can attain competitive accuracy without weight training — in GTs. Previous studies have explored SLTs in message-passing neural networks (MPNNs), showing that SLTs not only exist in MPNNs but also help mitigate over-smoothing and improve robustness. However, the potential of SLTs in GTs remains unexplored. Because GTs have roughly 4.5$\times$ more parameters than MPNNs, SLTs offer even greater practical value in this setting. We find that fixed random weights combined with a traditional SLT search method cannot adapt to feature imbalances in GTs, leading to highly biased attention that destabilizes model performance. To overcome this issue and search for SLTs efficiently, we introduce a novel approach called Adaptive Scaling. We empirically confirm the existence of SLTs within GTs and demonstrate their versatility through extensive experiments across different GT architectures, including NodeFormer, GRIT, and GraphGPS. Our findings demonstrate that SLTs achieve comparable accuracy while reducing memory usage by 2--32$\times$, generalize effectively to out-of-distribution data, and enhance robustness against adversarial perturbations. This work highlights that SLTs offer a resource-efficient approach to improving the scalability, efficiency, and robustness of GTs, with broad implications for applications involving graph data.
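For readers unfamiliar with SLT search, the sketch below illustrates the general idea of finding a subnetwork in a frozen, randomly initialized layer by training only per-weight scores (in the spirit of edge-popup-style supermask search), with a learnable per-output scale standing in for a rescaling of the fixed weights. The class names, the `scale` parameter, and its placement are illustrative assumptions; this is not the paper's exact Adaptive Scaling mechanism, whose details are given in the full text.

```python
# Hypothetical sketch: SLT search on a frozen-weight linear layer with a learnable
# per-output scale. Only the scores and the scale are trained; the random weights
# themselves are never updated.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopKMask(torch.autograd.Function):
    """Keep the top-(1 - sparsity) fraction of scores as a binary mask;
    pass gradients straight through to the scores."""

    @staticmethod
    def forward(ctx, scores, sparsity):
        k = max(1, int((1.0 - sparsity) * scores.numel()))      # number of weights to keep
        threshold = torch.topk(scores.flatten(), k).values.min()
        return (scores >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output, None                                 # straight-through estimator


class SLTLinear(nn.Module):
    def __init__(self, in_features, out_features, sparsity=0.5):
        super().__init__()
        # Frozen random weights: requires_grad=False, so they stay at initialization.
        self.weight = nn.Parameter(torch.empty(out_features, in_features), requires_grad=False)
        nn.init.kaiming_uniform_(self.weight)
        # Trainable per-weight scores that select the subnetwork.
        self.scores = nn.Parameter(torch.rand(out_features, in_features))
        # Assumed learnable per-output scale applied to the masked output.
        self.scale = nn.Parameter(torch.ones(out_features))
        self.sparsity = sparsity

    def forward(self, x):
        mask = TopKMask.apply(self.scores.abs(), self.sparsity)
        return F.linear(x, mask * self.weight) * self.scale


# Usage: a drop-in replacement for nn.Linear, e.g. in an attention projection.
layer = SLTLinear(64, 64, sparsity=0.7)
out = layer(torch.randn(8, 64))
```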
Submission Length: Regular submission (no more than 12 pages of main content)
Supplementary Material: zip
Assigned Action Editor: ~Wenbing_Huang1
Submission Number: 3814