Temporal Knowledge Extrapolation Based on Fine-Grained Tensor Graph Attention Network for Responsible AI

Published: 2025 · Last Modified: 12 Nov 2025 · IEEE Trans. Artif. Intell. 2025 · License: CC BY-SA 4.0
Abstract: Knowledge guidance is crucial for bridging the gap between high-level artificial intelligence (AI) ethics principles and the practical implementation of responsible AI systems. In contrast to static knowledge inference and temporal interpolation, temporal knowledge extrapolation, which predicts future facts from the evolution of historical facts, poses formidable challenges that remain largely unsolved. Existing temporal extrapolation methods address the structural learning of concurrent facts primarily with relation-aware graph neural networks. However, the temporal validity of facts makes temporal subgraphs sparse, and the coarse-level learning mechanisms inherent in these methods hinder the capture of nuanced and scarce local semantics. This article therefore proposes a fine-grained tensor graph attention network (F-GAT) that promotes representation learning of concurrent facts on sparse subgraphs by effectively distinguishing the significance of entities and relations within triplets and acquiring self-attention across diverse triplet contexts. Building on F-GAT, this article further proposes a recurrent evolution graph attention network (RE-GAT) for temporal knowledge extrapolation. RE-GAT employs gated recurrent units to iteratively capture the sequential patterns between adjacent temporal facts, while simultaneously learning enriched embeddings under the dual influence of historical and structural factors. Competitive results on entity and relation prediction against advanced methods on six public benchmarks demonstrate the effectiveness of RE-GAT for temporal extrapolation.
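To make the described pipeline concrete, the sketch below illustrates the general pattern the abstract outlines: attention-weighted aggregation over concurrent (subject, relation, object) triplets within each temporal snapshot, followed by a gated recurrent unit that carries entity embeddings across adjacent timestamps. This is a minimal illustrative assumption, not the authors' exact F-GAT/RE-GAT formulation; the class names, dimensions, scoring function, and the way historical and structural factors are combined are all hypothetical simplifications.

```python
# Simplified sketch of the recurrent-evolution idea: per-snapshot triplet
# attention plus a GRU step across timestamps. Illustrative only; it does
# not reproduce the paper's fine-grained tensor attention or training setup.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TripletAttentionLayer(nn.Module):
    """Aggregates messages to each subject entity, weighting every
    (subject, relation, object) triplet by a learned attention score."""

    def __init__(self, dim: int):
        super().__init__()
        self.msg = nn.Linear(3 * dim, dim)   # message from [h_s, h_r, h_o]
        self.att = nn.Linear(3 * dim, 1)     # attention logit per triplet

    def forward(self, ent_emb, rel_emb, triplets):
        s, r, o = triplets[:, 0], triplets[:, 1], triplets[:, 2]
        ctx = torch.cat([ent_emb[s], rel_emb[r], ent_emb[o]], dim=-1)
        logits = self.att(ctx).squeeze(-1)

        # Softmax over the triplets that share the same subject entity.
        weights = torch.zeros_like(logits)
        for node in s.unique():
            mask = s == node
            weights[mask] = F.softmax(logits[mask], dim=0)

        messages = torch.tanh(self.msg(ctx)) * weights.unsqueeze(-1)
        out = torch.zeros_like(ent_emb).index_add(0, s, messages)
        return F.normalize(ent_emb + out, dim=-1)


class RecurrentEvolution(nn.Module):
    """Evolves entity embeddings over time: structural aggregation within
    each snapshot, then a GRU step linking adjacent snapshots."""

    def __init__(self, num_ents: int, num_rels: int, dim: int):
        super().__init__()
        self.ent_emb = nn.Parameter(torch.randn(num_ents, dim) * 0.1)
        self.rel_emb = nn.Parameter(torch.randn(num_rels, dim) * 0.1)
        self.gat = TripletAttentionLayer(dim)
        self.gru = nn.GRUCell(dim, dim)

    def forward(self, snapshots):
        """snapshots: list of LongTensor[(num_facts_t, 3)], one per timestamp."""
        h = self.ent_emb
        for triplets in snapshots:
            structural = self.gat(h, self.rel_emb, triplets)  # within-snapshot structure
            h = self.gru(structural, h)                       # carry history forward
        return h  # evolved embeddings used to score future facts


if __name__ == "__main__":
    model = RecurrentEvolution(num_ents=10, num_rels=4, dim=16)
    history = [torch.tensor([[0, 1, 2], [0, 2, 3], [4, 0, 5]]),
               torch.tensor([[2, 3, 4], [0, 1, 5]])]
    evolved = model(history)
    print(evolved.shape)  # torch.Size([10, 16])
```

In this toy setup, the returned embeddings would feed a downstream decoder that scores candidate future triplets; the paper's evaluation on entity and relation prediction corresponds to that final scoring stage.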