Keywords: Dynamic Graph Representation Learning; Graph Neural Networks; Temporal Motifs
Abstract: Although existing continuous-time dynamic graph representation learning methods have achieved great success by constructing extra features based on temporal motifs, they fail to fully leverage the spatial and temporal structures of temporal motifs to model the high-order correlations among nodes. An intuitive and innovative approach is to incorporate temporal motifs into message passing networks, which encounters the following challenges: (1) a temporal motif exhibits diverse semantics; (2) complex high-order correlations exist among the constituent edges of temporal motifs. To this end, we propose a Temporal Motif Message Passing network (TMMP) for dynamic graph representation learning. Specifically, a temporal motif extension module is proposed to obtain extended temporal motifs based on different time window lengths. Then, an attentive temporal motif encoder is proposed to capture the diverse semantics of temporal motifs. In addition, a dual hypergraph temporal motif message passing mechanism is proposed to comprehensively model the complex relationships among the constituent edges. Extensive experiments on four real-world datasets demonstrate that TMMP achieves state-of-the-art performance, surpassing the best baseline methods in both AP and AUC metrics on almost all datasets.
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 5173