Re-Temp: Relation-Aware Temporal Representation Learning for Temporal Knowledge Graph Completion

Anonymous

17 Apr 2023 · ACL ARR 2023 April Blind Submission · Readers: Everyone
Abstract: Temporal Knowledge Graph Completion (TKGC) under the extrapolation setting aims to predict the missing entity of a fact in the future, which is challenging and more closely aligned with real-world prediction problems. Most existing research encodes entities and relations by applying a sequential graph neural network to recent snapshots. However, these methods do not skip snapshots that are irrelevant to the entity-related relation in the query, and they neglect the importance of explicit temporal information. Motivated by this, we propose Re-Temp (Relation-Aware Temporal Representation Learning), which takes explicit temporal embeddings as input and applies a skip information flow after each timestamp to discard information that is unnecessary for prediction. In addition, we propose a two-phase forward propagation method to prevent information leakage. Evaluated on six TKGC (extrapolation) datasets, our model significantly outperforms eight recent state-of-the-art models.
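The abstract does not give the model's equations, but the idea of a relation-aware "skip information flow" between snapshot encodings can be illustrated with a minimal sketch. Everything below (class name, gating form, dimensions) is an assumption for illustration only, not the authors' implementation: a sigmoid gate conditioned on the query relation decides, per entity, how much of the current snapshot's GNN output to keep versus carrying the previous snapshot's state forward.

```python
# Minimal sketch (not the authors' code): a relation-conditioned gate that can
# effectively "skip" a snapshot's update when it is irrelevant to the query relation.
import torch
import torch.nn as nn


class RelationAwareSkip(nn.Module):
    """Gate deciding, per entity, how much of the current snapshot's
    representation to keep, conditioned on the query relation embedding."""

    def __init__(self, dim: int):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, h_prev, h_curr, rel_emb):
        # rel_emb: (dim,) embedding of the query relation, broadcast over entities
        rel = rel_emb.expand_as(h_curr)
        g = torch.sigmoid(self.gate(torch.cat([h_curr, rel], dim=-1)))
        # g near 0 ignores the current snapshot (a "skip"); g near 1 keeps its update
        return g * h_curr + (1 - g) * h_prev


# Toy usage with made-up sizes.
dim, num_entities = 16, 5
skip = RelationAwareSkip(dim)
h_prev = torch.randn(num_entities, dim)   # entity states after snapshot t-1
h_curr = torch.randn(num_entities, dim)   # GNN output on snapshot t
rel = torch.randn(dim)                    # query-relation embedding
h_next = skip(h_prev, h_curr, rel)        # states passed on to snapshot t+1
```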
Paper Type: long
Research Area: NLP Applications
