Track: Social networks and social media
Keywords: Dynamic graphs, Transformer, Link prediction, Topology
TL;DR: We propose TopDyG, a topology-aware dynamic graph Transformer for link prediction. It explicitly captures topological features such as triangles, outperforming the SOTA baselines with a 43.27% NDCG and 28.75% Jaccard improvement.
Abstract: Dynamic graph link prediction is widely applicable to real-world complex networks, such as social networks, citation networks, and recommendation systems. Recent Transformer-based link prediction methods on dynamic graphs not only fail to model fine-grained structures such as triangles with vanilla Transformers in the graph serialization process, but also amplify the imbalanced degree distribution of graphs because of their over-estimation of high-degree nodes. To tackle these issues, we propose a Topology-aware Transformer on Dynamic Graphs (TopDyG) for link prediction, consisting of a topology-injected Transformer (Ti-Transformer) and a mutual information learning module (Mi-Learning). The Ti-Transformer exploits the explicit structure of serialized graphs, capturing their topological features. The Mi-Learning mines the relationships between nodes by modeling their mutual information with prior knowledge, alleviating the over-estimation of high-degree nodes when applying Transformer-based models to the dynamic graph link prediction task. Extensive experiments on four public datasets, covering both transductive and inductive settings, demonstrate the superiority of our proposal. In particular, TopDyG achieves improvements of 43.27% and 28.75% over the state-of-the-art baselines in terms of NDCG and Jaccard, respectively. The advantages are especially pronounced on high-density graphs.
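The abstract does not specify how topology is injected into the Transformer; one plausible reading of the Ti-Transformer idea is to add a triangle-count bias to the attention logits, so pairs of nodes that close triangles attend to each other more strongly. The sketch below illustrates this assumption in plain NumPy; the function names (`triangle_bias`, `topology_injected_attention`) and the additive-bias design are hypothetical and not taken from the paper.

```python
import numpy as np

def triangle_bias(adj):
    # Entry (i, j) of adj @ adj counts the common neighbors of i and j;
    # for an existing edge (i, j) this equals its triangle count.
    common = adj @ adj
    return common * adj  # keep counts only on existing edges

def topology_injected_attention(q, k, v, adj, alpha=1.0):
    # Standard scaled dot-product attention with an additive topology
    # bias -- one possible form of "topology injection" (an assumption,
    # not the paper's stated mechanism).
    d = q.shape[-1]
    logits = (q @ k.T) / np.sqrt(d)
    logits = logits + alpha * triangle_bias(adj)
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Toy graph: a triangle 0-1-2 plus a pendant node 3.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
x = np.random.default_rng(0).normal(size=(4, 8))
out = topology_injected_attention(x, x, x, adj)
```

In this toy graph the edge (0, 1) closes one triangle (via node 2) and so receives a positive bias, while the pendant edge (2, 3) closes none and is left unbiased.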
Submission Number: 1108