Retrieval and Reasoning on KGs: Integrating Knowledge Graphs into Large Language Models for Complex Question Answering
Abstract: Although Large Language Models (LLMs) have performed impressively on various Natural Language Processing (NLP) tasks, their inherent tendency to hallucinate severely undermines their credibility in complex reasoning.
Combining explainable Knowledge Graphs (KGs) with LLMs is a promising path to address this issue.
However, structured KGs are difficult for LLMs to consume directly, and enabling LLMs to understand and incorporate them remains a challenging problem.
We therefore reorganize KGs into a more efficient structure and design KG-related instruction-tuning and continual pre-training strategies that enable LLMs to learn and internalize this form of representation effectively.
Moreover, we construct subgraphs to further enhance KG retrieval via Chain-of-Thought (CoT) reasoning.
Extensive experiments on two KGQA datasets demonstrate that our model achieves convincing performance compared to strong baselines\footnote{All the code, data, and models will be publicly available at \url{https://anonymous.com}}.
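The abstract does not specify the reorganized KG representation, so the following is a minimal sketch of one common approach: linearizing KG triples into textual instruction-tuning records that an LLM can learn from. The record schema, entity names, and serialization format here are hypothetical, not the paper's actual design.

```python
# Sketch (assumed, not the paper's format): serialize KG triples into
# instruction-tuning examples so an LLM can internalize graph structure.

from typing import List, Tuple

Triple = Tuple[str, str, str]  # (head entity, relation, tail entity)

def linearize_triples(triples: List[Triple]) -> str:
    """Flatten triples into a single LLM-readable string."""
    return "; ".join(f"({h}, {r}, {t})" for h, r, t in triples)

def build_instruction_example(question: str, triples: List[Triple], answer: str) -> dict:
    """Wrap a question and its supporting triples into one training record."""
    return {
        "instruction": "Answer the question using the knowledge graph triples.",
        "input": f"Triples: {linearize_triples(triples)}\nQuestion: {question}",
        "output": answer,
    }

if __name__ == "__main__":
    # Hypothetical triples for illustration only.
    triples = [("Barack Obama", "born_in", "Honolulu"),
               ("Honolulu", "located_in", "Hawaii")]
    print(build_instruction_example(
        "Which state was Barack Obama born in?", triples, "Hawaii"))
```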
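Likewise, the subgraph construction step is only named, not described. A plausible reading is a k-hop expansion around the question's topic entities, with the retrieved triples folded into a CoT prompt; the sketch below assumes exactly that, and all function names and the prompt wording are our own.

```python
# Sketch (assumed): k-hop subgraph retrieval around seed entities,
# followed by assembling a chain-of-thought prompt over the retrieved facts.

from collections import defaultdict
from typing import Dict, List, Set, Tuple

Triple = Tuple[str, str, str]

def build_adjacency(triples: List[Triple]) -> Dict[str, List[Triple]]:
    """Index each triple under both endpoints (undirected expansion)."""
    adj = defaultdict(list)
    for h, r, t in triples:
        adj[h].append((h, r, t))
        adj[t].append((h, r, t))
    return adj

def khop_subgraph(triples: List[Triple], seeds: Set[str], k: int = 2) -> List[Triple]:
    """Collect all triples reachable within k hops of the seed entities."""
    adj = build_adjacency(triples)
    frontier, visited, sub = set(seeds), set(seeds), set()
    for _ in range(k):
        nxt = set()
        for ent in frontier:
            for h, r, t in adj[ent]:
                sub.add((h, r, t))
                for n in (h, t):
                    if n not in visited:
                        visited.add(n)
                        nxt.add(n)
        frontier = nxt
    return sorted(sub)

def cot_prompt(question: str, subgraph: List[Triple]) -> str:
    """Fold retrieved triples into a step-by-step reasoning prompt."""
    facts = "\n".join(f"- ({h}, {r}, {t})" for h, r, t in subgraph)
    return (f"Facts:\n{facts}\n\nQuestion: {question}\n"
            "Let's reason step by step over the facts, then give the final answer.")
```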
Paper Type: Long
Research Area: NLP Applications
Research Area Keywords: knowledge graphs
Contribution Types: NLP engineering experiment
Languages Studied: English
Submission Number: 3466