ShotSight: Explaining KGE Models with an LLM-Ready, Example-Based Heuristic

ICLR 2026 Conference Submission 14879 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: XAI, link prediction, knowledge graph embeddings, CBR
Abstract: This article tackles the critical challenge of explainability in Knowledge Graph Embedding (KGE) models. We introduce ShotSight, a novel case-based reasoning approach that leverages the latent-space representations of nodes and edges in a knowledge graph to generate compelling, human-understandable, example-based explanations for link predictions. By analyzing the impact of the identified triples on model performance, we demonstrate the effectiveness of our approach in generating explanations compared to random baselines. We evaluate our method on two publicly available datasets and show its superiority in terms of explanatory power for KGE models. Furthermore, we demonstrate the broader applicability of this technique beyond traditional KGE explanations: our method can also serve as a valuable aid in constructing relevant “shots” for few-shot prompting of Large Language Models (LLMs), making KGE models LLM-ready.
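
The abstract does not spell out the retrieval procedure, so the following is only a minimal sketch of the general idea it describes: retrieve the training triples closest to the triple being explained in the KGE latent space, and reuse them either as example-based explanations or as "shots" in a few-shot LLM prompt. All names here (`nearest_example_triples`, `entity_emb`, `triples_to_prompt`, cosine similarity over concatenated embeddings) are illustrative assumptions, not ShotSight's actual implementation.

```python
import numpy as np

def nearest_example_triples(entity_emb, relation_emb, query, train_triples, k=3):
    """Return the k training triples closest to the query triple in the
    KGE latent space (cosine similarity over concatenated embeddings).

    entity_emb, relation_emb : dicts mapping ids to 1-D numpy vectors
    query                    : (head, relation, tail) triple to explain
    train_triples            : list of (head, relation, tail) triples
    """
    def embed(triple):
        h, r, t = triple
        v = np.concatenate([entity_emb[h], relation_emb[r], entity_emb[t]])
        return v / (np.linalg.norm(v) + 1e-12)  # unit-normalize for cosine similarity

    q = embed(query)
    # Rank training triples by similarity to the query triple and keep the top k.
    scored = sorted(train_triples, key=lambda tr: -float(q @ embed(tr)))
    return scored[:k]

def triples_to_prompt(shots, query):
    """Render retrieved triples as few-shot examples for an LLM prompt."""
    lines = [f"Example: {h} --{r}--> {t}" for h, r, t in shots]
    h, r, t = query
    lines.append(f"Question: is the link {h} --{r}--> {t} plausible?")
    return "\n".join(lines)
```

The same retrieved neighbours serve both purposes mentioned in the abstract: shown to a user, they act as example-based explanations of the prediction; serialized with `triples_to_prompt`, they become the "shots" of a few-shot prompt.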
Supplementary Material: zip
Primary Area: interpretability and explainable AI
Submission Number: 14879