Learning Dynamic Graph Embeddings Using Random Walk With Temporal Backtracking

24 Sept 2022 (modified: 05 May 2023) · TGL@NeurIPS2022 Long Paper
Keywords: Dynamic Graph Embedding, Graph Representation Learning, Temporal Graphs, Graph Retrieval
TL;DR: In this paper, we propose a novel temporal graph-level embedding method
Abstract: Representation learning on graphs (also referred to as network embedding) can be done at different levels of granularity, from the node level to the graph level. The majority of work on graph representation learning focuses on the former; while there has been some work on graph-level embedding, it typically deals with static networks. However, learning low-dimensional graph-level representations for dynamic (i.e., temporal) networks is important for downstream graph retrieval tasks such as temporal graph similarity ranking, temporal graph isomorphism, and anomaly detection. In this paper, we propose a novel temporal graph-level embedding method to fill this gap. Our method first builds a multilayer graph and then utilizes a novel modified random walk with temporal backtracking to generate temporal contexts for the nodes in the graph. Finally, a "document-level" language model is learned from these contexts to generate graph-level embeddings. We evaluate our model on five publicly available datasets for two commonly used tasks: graph similarity ranking and anomaly detection. Our results show that our method achieves state-of-the-art performance compared to all prior baselines.
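To give a concrete flavor of the walk-generation step described above, here is a minimal sketch of a temporal random walk with a backtracking option. Everything in it is an illustrative assumption: the edge-list format, the timestamp-non-decreasing walk policy, the `p_backtrack` parameter, and all function names are hypothetical and are not taken from the paper.

```python
import random

# Toy temporal edge list as (source, target, timestamp) triples.
# This data layout is an assumption for illustration, not the paper's format.
EDGES = [
    ("a", "b", 1), ("b", "c", 2), ("c", "d", 3),
    ("b", "d", 2), ("d", "a", 4), ("a", "c", 5),
]

def neighbors(node):
    """Outgoing temporal edges from `node` as (next_node, timestamp) pairs."""
    return [(v, t) for u, v, t in EDGES if u == node]

def temporal_walk(start, length, p_backtrack=0.2, seed=0):
    """Random walk that normally follows edges with non-decreasing
    timestamps, but with probability `p_backtrack` relaxes that constraint
    (a hypothetical 'backtracking' move) so the walk can revisit earlier
    temporal contexts."""
    rng = random.Random(seed)
    walk, t_cur = [start], 0
    for _ in range(length - 1):
        nbrs = neighbors(walk[-1])
        forward = [(v, t) for v, t in nbrs if t >= t_cur]
        # Backtracking step: allow any temporal edge, resetting the clock.
        pool = nbrs if (not forward or rng.random() < p_backtrack) else forward
        if not pool:
            break
        nxt, t_cur = rng.choice(pool)
        walk.append(nxt)
    return walk

# Walks like these would serve as the "sentences" fed to a
# document-level language model to produce a graph-level embedding.
print(temporal_walk("a", 6))
```

Under this sketch, each walk is a node sequence respecting time order except at backtracking steps; collecting many such walks per graph yields the temporal contexts the abstract refers to.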
Paper Format: full paper (8 pages)