GLTW: Joint Improved Graph Transformer and LLM via Three-Word Language for Knowledge Graph Completion

ACL ARR 2025 February Submission 3597 Authors

15 Feb 2025 (modified: 09 May 2025) · CC BY 4.0
Abstract:

Knowledge Graph Completion (KGC), which aims to infer missing or incomplete facts, is a crucial task for KGs. However, integrating the vital structural information of KGs into Large Language Models (LLMs) and outputting predictions deterministically remains challenging. To address this, we propose a new method called \textbf{GLTW}, which encodes the structural information of KGs and merges it with LLMs to enhance KGC performance. Specifically, we introduce an improved Graph Transformer (\textbf{iGT}) that effectively encodes subgraphs with both local and global structural information and inherits the characteristics of language models, avoiding the need to train from scratch. We also develop a subgraph-based multi-classification training objective that uses all entities within the KG as classification targets to boost learning efficiency. Importantly, we combine iGT with an LLM that takes KG language prompts as input. Our extensive experiments on various KG datasets show that GLTW achieves significant performance gains over SOTA baselines.
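To make the subgraph-based multi-classification objective concrete: it amounts to treating every entity in the KG as a candidate class and training with cross-entropy over the full entity vocabulary, rather than scoring sampled negatives one pair at a time. The PyTorch sketch below is illustrative only; the classifier head, dimensions, and all names are assumptions for exposition, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Illustrative sketch (not the authors' code): a multi-classification
# objective that scores every KG entity as a candidate answer for a
# (head, relation) query encoded from its subgraph.
NUM_ENTITIES = 14_541   # assumed entity-vocabulary size (FB15k-237-like)
HIDDEN_DIM = 768        # assumed encoder width

class EntityClassifier(nn.Module):
    """Maps a pooled subgraph/query representation to logits over
    all entities in the KG (hypothetical head, for illustration)."""
    def __init__(self, hidden_dim: int, num_entities: int):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, num_entities)

    def forward(self, query_repr: torch.Tensor) -> torch.Tensor:
        # query_repr: (batch, hidden_dim) from the subgraph encoder
        return self.proj(query_repr)  # (batch, num_entities) logits

model = EntityClassifier(HIDDEN_DIM, NUM_ENTITIES)
criterion = nn.CrossEntropyLoss()

# Dummy batch standing in for encoder outputs and gold tail entities.
query_repr = torch.randn(4, HIDDEN_DIM)
gold_tails = torch.randint(0, NUM_ENTITIES, (4,))

loss = criterion(model(query_repr), gold_tails)
loss.backward()
print(f"loss: {loss.item():.4f}")
```

Under this framing, one forward pass yields a gradient signal against every entity simultaneously, which is the usual efficiency argument for full-vocabulary classification over pairwise negative sampling.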

Paper Type: Long
Research Area: Information Extraction
Research Area Keywords: Knowledge Graph Completion, Graph Transformer, Large Language Models
Contribution Types: NLP engineering experiment
Languages Studied: English
Submission Number: 3597