Abstract: Temporal knowledge graph (TKG) extrapolation reasoning, which aims to predict future events given a known KG sequence, benefits a broad range of applications such as policy-making and financial analysis. The key to this task is discerning how knowledge evolves within these sequences. Most existing works model evolution patterns by continuously sampling from TKGs, without ensuring that the samples contain relevant facts or considering knowledge beyond the samples. To address these challenges, we propose FL-Evo, a novel model that performs prediction by capturing fact and logical knowledge evolution patterns. To model the fact evolution pattern, fact knowledge is first distilled from large language models using designed prompts and subsequently refined with the TKG; an entity-based subgraph sampling strategy then extracts relevant facts from the TKG to capture fact evolution patterns. Furthermore, logical knowledge mined from the TKG is used to derive the corresponding evolution pattern. Finally, the outputs of these two evolution patterns are integrated to produce the final prediction. Experimental results on five benchmark datasets demonstrate that FL-Evo outperforms existing temporal knowledge graph reasoning models, with improvements of up to 3.97% in Hit@3 and 4.07% in Hit@10. Notably, FL-Evo substantially enhances reasoning performance for unseen entities that lack prior records.