Preserving Spatial-Temporal Relationship with Adaptive Node Sampling in Hierarchical Dynamic Graph Transformers
Keywords: Graph Transformer, Laplacian Positional Encoding, Structural Encoding, Adaptive Sampling
Verify Author List: I have double-checked the author list and understand that additions and removals will not be allowed after the submission deadline.
Abstract: Dynamic Graph Transformers (DGTs) have demonstrated remarkable performance in various applications, such as social networks, traffic forecasting, and recommendation systems. Despite their effectiveness in capturing long-range dependencies, training DGTs on large graphs remains challenging. Mini-batch training is commonly used to alleviate this issue, but it often fails to capture complex dependencies or sacrifices performance.
To address these problems, we propose the $\underline{A}$daptive Node $\underline{S}$ampling in $\underline{H}$ierarchical $\underline{D}$ynamic $\underline{G}$raph $\underline{T}$ransformers (ASH-DGT) architecture, which samples a set of suitable nodes that preserves the spatial-temporal relationships in the dynamic graph when training DGTs.
Unlike previous methods that rely on random or purely structural sampling, our approach is motivated by the observation that a node's contribution to learning performance can be time-sensitive, while spatial correlations in the dynamic graph, reflecting both its global and local structure, must still be preserved.
Through extensive evaluations on popular real-world datasets for node classification and link prediction, ASH-DGT consistently outperforms multiple state-of-the-art methods, achieving both higher accuracy and significant improvements in training efficiency.
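The following is a minimal, illustrative sketch of time-and-structure-aware node sampling of the kind the abstract describes; it is not the ASH-DGT algorithm. The scoring function, the exponential temporal decay, the degree-based structural weight, and all parameter names (`tau`, `budget`, `now`) are assumptions made purely for illustration.

```python
# Illustrative sketch only -- NOT the authors' ASH-DGT method.
# Assumption: a node's sampling weight combines (a) temporal recency of its
# latest interaction and (b) a simple structural proxy (degree).
import numpy as np

def adaptive_node_sample(edges, timestamps, num_nodes, budget, now, tau=1.0, rng=None):
    """Sample `budget` nodes with probability proportional to a
    time-decayed recency score times a degree-based structural score."""
    rng = rng or np.random.default_rng(0)
    last_seen = np.zeros(num_nodes)   # most recent interaction time per node
    degree = np.zeros(num_nodes)      # undirected degree as a local-structure proxy
    for (u, v), t in zip(edges, timestamps):
        last_seen[u] = max(last_seen[u], t)
        last_seen[v] = max(last_seen[v], t)
        degree[u] += 1
        degree[v] += 1
    temporal = np.exp(-(now - last_seen) / tau)   # recently active nodes weighted higher
    structural = np.log1p(degree)                 # damped degree weight
    score = temporal * structural + 1e-12         # avoid zero-probability nodes
    probs = score / score.sum()
    return rng.choice(num_nodes, size=min(budget, num_nodes), replace=False, p=probs)

# Toy usage: 5 nodes, 4 timestamped edges, sample a mini-batch of 3 nodes.
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]
timestamps = [1.0, 2.0, 3.0, 4.0]
print(adaptive_node_sample(edges, timestamps, num_nodes=5, budget=3, now=5.0))
```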
A Signed Permission To Publish Form In Pdf: pdf
Primary Area: Deep Learning (architectures, deep reinforcement learning, generative models, deep learning theory, etc.)
Paper Checklist Guidelines: I certify that all co-authors of this work have read and commit to adhering to the guidelines in Call for Papers.
Student Author: No
Submission Number: 344