Query-Aware Dynamic Representation Learning for Temporal Knowledge Graph Reasoning

Published: 2025 · Last Modified: 05 Dec 2025 · ISWC 2025 · License: CC BY-SA 4.0
Abstract: Temporal Knowledge Graph (TKG) reasoning, which answers queries about the future based on historical facts, has garnered significant attention. Existing methods learn dynamic representations of entities and then answer queries based on these representations. However, they learn these representations primarily from historical facts, without considering the information contained in the queries themselves. In fact, queries can highlight relevant historical facts, enabling the model to learn more task-specific and accurate dynamic representations. Moreover, concurrent queries often exhibit structural dependencies that can support more precise and coherent predictions. Motivated by these observations, we propose Query-aware Dynamic Representation Learning (QDRL), a method that adaptively incorporates query information into the dynamic representation learning process. Specifically, to capture the structural dependencies among concurrent queries, QDRL employs a CNN-based query encoder to generate query representations, which are then integrated into the dynamic entity representations via a Transformer-based temporal encoder. In addition, since some static background knowledge about entities is not explicitly represented in the TKG, we leverage Large Language Models (LLMs) to construct a comprehensive background knowledge graph. By modeling this graph, QDRL produces more informative initial entity representations, which in turn improve the dynamic representations. Experiments on five TKG reasoning benchmarks show that QDRL significantly outperforms state-of-the-art baselines, with MRR gains of up to 5.41% on entity prediction and 16.96% on relation prediction. The code and data are released here: https://github.com/vzhang-mm/QDRL.
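To make the described architecture concrete, below is a minimal PyTorch sketch of the two components named in the abstract: a CNN-based encoder over concurrent queries and a Transformer-based temporal encoder that fuses the query representation into dynamic entity states. All module names, tensor shapes, and wiring here are illustrative assumptions, not the authors' implementation; see the linked repository for the actual code.

```python
# Hypothetical sketch of the QDRL pipeline from the abstract.
# Shapes and design choices are assumptions for illustration only.
import torch
import torch.nn as nn


class QueryEncoder(nn.Module):
    """CNN over concurrent queries, intended to capture their structural dependencies."""

    def __init__(self, dim: int, kernel_size: int = 3):
        super().__init__()
        self.conv = nn.Conv1d(dim, dim, kernel_size, padding=kernel_size // 2)

    def forward(self, queries: torch.Tensor) -> torch.Tensor:
        # queries: (num_concurrent_queries, dim), e.g. subject + relation embeddings
        h = self.conv(queries.t().unsqueeze(0))  # (1, dim, num_queries)
        return torch.relu(h).squeeze(0).t()      # (num_queries, dim)


class TemporalEncoder(nn.Module):
    """Transformer that integrates a query representation into an entity's history."""

    def __init__(self, dim: int, num_layers: int = 2, num_heads: int = 4):
        super().__init__()
        layer = nn.TransformerEncoderLayer(dim, num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)

    def forward(self, entity_history: torch.Tensor, query_repr: torch.Tensor) -> torch.Tensor:
        # entity_history: (batch, timesteps, dim); query_repr: (batch, dim)
        seq = torch.cat([entity_history, query_repr.unsqueeze(1)], dim=1)
        out = self.encoder(seq)
        return out[:, -1]  # query-aware dynamic entity representation


# Toy usage with random tensors (all sizes illustrative):
dim = 64
queries = torch.randn(8, dim)                 # 8 concurrent queries
query_repr = QueryEncoder(dim)(queries)       # (8, dim)
history = torch.randn(8, 10, dim)             # 10 historical snapshots per query subject
dyn = TemporalEncoder(dim)(history, query_repr)
print(dyn.shape)                              # torch.Size([8, 64])
```

In this sketch the query token is simply appended to the history sequence before the Transformer, so attention can condition every historical state on the query; the paper may integrate the two differently.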