LANTERN in the Event Stream: Training-Free Temporal Knowledge Graph Forecasting by Balancing Inertia and Shifts
Keywords: Temporal Knowledge Graph Forecasting; Training-Free Prompting; Large Language Models; In-Context Learning; Evidence Selection; Non-Stationarity and Regime Shifts
Abstract: Temporal knowledge graph forecasting (TKGF) ranks plausible future entities for a timestamped query based on historical events. Recent work shows that large language models (LLMs) can perform TKGF in a training-free manner through in-context learning, but performance is highly sensitive to which historical evidence is selected and how it is organized under a fixed context budget. To address this, we present LANTERN, a training-free prompting framework that explicitly balances interaction inertia against regime shifts using fixed-count event windows: a smoothed strength prior computed over a long window, and a Beta-Binomial novelty score that measures how sharply a short window deviates from that prior. We build a compact, diverse evidence set with an LLM-based usefulness gate, Pareto filtering, and constrained greedy selection, and retrieve one structure-aware analogical demonstration. Across ICEWS14, ICEWS05-15, ICEWS18, and GDELT, LANTERN consistently outperforms the state-of-the-art training-free baseline (Tang et al., 2025) under the same backbone, improving Hits@1 by up to 2.5 points and MRR by up to 1.2 points.
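A minimal Python sketch of the window-balancing scores described in the abstract, assuming the long window supplies pseudo-counts for a Beta prior and novelty is the upper-tail surprise of the short-window count under the resulting Beta-Binomial; the function names (strength_prior, novelty_score), the symmetric smoothing constant, and the tail-surprise form are illustrative assumptions, not the paper's published equations:

    import math

    def log_beta(a, b):
        # log B(a, b) via lgamma, for numerical stability
        return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

    def log_beta_binomial_pmf(k, n, alpha, beta):
        # log P(X = k) for X ~ BetaBinomial(n, alpha, beta)
        log_choose = math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
        return log_choose + log_beta(k + alpha, n - k + beta) - log_beta(alpha, beta)

    def strength_prior(k_long, n_long, smoothing=1.0):
        # Smoothed long-window rate: the "inertia" signal for a candidate
        # seen k_long times among n_long recent events.
        return (k_long + smoothing) / (n_long + 2.0 * smoothing)

    def novelty_score(k_short, n_short, k_long, n_long, smoothing=1.0):
        # Fit Beta(alpha, beta) from long-window pseudo-counts, then score the
        # short-window count by its upper-tail probability under that prior:
        # a count the old regime would rarely produce signals a shift.
        alpha = k_long + smoothing
        beta = (n_long - k_long) + smoothing
        tail = sum(math.exp(log_beta_binomial_pmf(k, n_short, alpha, beta))
                   for k in range(k_short, n_short + 1))
        return -math.log(max(tail, 1e-300))

    # A candidate seen 2/200 times long-term but 5/20 times recently looks
    # novel; one at a steady rate (20/200 long, 2/20 short) does not.
    print(novelty_score(5, 20, 2, 200))   # large: likely regime shift
    print(novelty_score(2, 20, 20, 200))  # small: consistent with inertia

Working in log space keeps the tail surprise finite for bursty candidates whose Beta-Binomial tail mass would otherwise underflow.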
Paper Type: Long
Research Area: Machine Learning for NLP
Research Area Keywords: graph-based methods; knowledge-augmented methods; structured prediction; few-shot learning; generalization
Contribution Types: NLP engineering experiment, Approaches for low-compute settings, efficiency
Languages Studied: Python
Submission Number: 5294