GenTKG: Generative Forecasting on Temporal Knowledge Graph

Published: 20 Oct 2023, Last Modified: 24 Nov 2023. TGL Workshop 2023, Long Paper.
Keywords: Generative Forecasting, Temporal Knowledge Graph Forecasting, Large Language Model
TL;DR: We find that LLMs can understand structured temporal relational data and serve as the foundation model for temporal relational forecasting.
Abstract: The rapid advancements in large language models (LLMs) have sparked interest in the temporal knowledge graph (tKG) domain, where carefully designed embedding-based and rule-based models have long dominated. It remains an open question whether pre-trained LLMs can understand structured temporal relational data and replace these models as the foundation model for temporal relational forecasting. We therefore bring temporal knowledge forecasting into the generative setting. Two challenges arise, however: the chasm between the complex structure of temporal graph data and the sequential natural-language expressions LLMs can handle, and the chasm between the enormous size of tKGs and the heavy computational cost of fine-tuning LLMs. To address these challenges, we propose GENTKG, a novel retrieval-augmented generation framework that combines a temporal logical rule-based retrieval strategy with lightweight few-shot parameter-efficient instruction tuning. Extensive experiments show that GENTKG outperforms conventional temporal relational forecasting methods under low computational resources with extremely limited training data, as few as 16 samples. GENTKG also exhibits remarkable cross-domain and in-domain generalizability, outperforming baselines on unseen datasets without re-training. Our work reveals the huge potential of LLMs in the tKG domain and opens a new frontier for generative forecasting on tKGs.
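To illustrate the retrieval-augmented generation idea the abstract describes, the sketch below shows one plausible way to verbalize a temporal knowledge graph query for an LLM: retrieve historical quadruples relevant to the query and serialize them into a sequential text prompt whose completion is the forecast object. All names (`Fact`, `retrieve_history`, `build_prompt`) and the recency-based retrieval heuristic are illustrative assumptions, not the paper's actual temporal-logical-rule implementation.

```python
from typing import List, NamedTuple


class Fact(NamedTuple):
    """One tKG quadruple: (subject, relation, object, timestamp)."""
    subject: str
    relation: str
    obj: str
    timestamp: int


def retrieve_history(facts: List[Fact], query_subject: str,
                     query_time: int, k: int = 5) -> List[Fact]:
    """Toy retrieval: keep facts about the query subject that occurred before
    the query time, most recent first. This recency heuristic stands in for
    the paper's temporal logical rule-based retrieval scoring."""
    history = [f for f in facts
               if f.subject == query_subject and f.timestamp < query_time]
    history.sort(key=lambda f: f.timestamp, reverse=True)
    return history[:k]


def build_prompt(history: List[Fact], query_subject: str,
                 query_relation: str, query_time: int) -> str:
    """Verbalize retrieved facts chronologically; the final line is left
    open for the LLM to complete with the predicted object entity."""
    lines = [f"{f.timestamp}: [{f.subject}, {f.relation}, {f.obj}]"
             for f in reversed(history)]
    lines.append(f"{query_time}: [{query_subject}, {query_relation},")
    return "\n".join(lines)


# Hypothetical mini-tKG and query.
facts = [
    Fact("Germany", "consult", "France", 1),
    Fact("Germany", "negotiate", "Austria", 2),
    Fact("Germany", "sign_agreement", "France", 3),
]
prompt = build_prompt(retrieve_history(facts, "Germany", 4),
                      "Germany", "consult", 4)
print(prompt)
```

A prompt like this, prefixed with a task instruction, would then be used both for few-shot parameter-efficient instruction tuning and for inference-time forecasting.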
Format: Long paper, up to 8 pages. If the reviewers recommend it to be changed to a short paper, I would be willing to revise my paper to fit within 4 pages.
Submission Number: 47