Abstract: Pre-trained language models (PLMs) have achieved remarkable success in knowledge graph completion (KGC). However, previous methods derive KGC results mainly from triple-level and text-based learning, which cannot capture long-term relational and structural information. Moreover, the absence of a visible reasoning process leads to poor interpretability and credibility of the completions. In this paper, we propose a path-enhanced PLM-based knowledge graph completion method (PEKGC), which employs multi-view generation to infer missing facts at the triple level and the path level simultaneously, addressing both the lack of long-term relational information and the interpretability issue. Furthermore, we propose a neighbor selector module that filters neighboring triples to provide adjacent structural information. Finally, we propose a fact-level re-evaluation and a heuristic fusion ranking strategy that fuse the multi-view predictions over candidate answers. Extensive experiments on benchmark datasets demonstrate that our model significantly improves performance on the KGC task.
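The abstract does not specify the fusion heuristic, so the following is only a minimal sketch of how a heuristic fusion ranking over triple-level and path-level candidate lists could look, assuming a reciprocal-rank-style combination; the function name `fuse_rankings`, the constant `k`, and the toy data are all invented for illustration and are not from the paper.

```python
# Hypothetical sketch: fuse two ranked candidate lists (best first) from a
# triple-level view and a path-level view into one ranking. Assumes a
# reciprocal-rank-fusion-style heuristic, NOT the paper's actual strategy.
from collections import defaultdict

def fuse_rankings(triple_view, path_view, k=60):
    """Each view contributes a reciprocal-rank score 1 / (k + rank);
    candidates ranked high in either view rise in the fused order."""
    scores = defaultdict(float)
    for view in (triple_view, path_view):
        for rank, candidate in enumerate(view, start=1):
            scores[candidate] += 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Toy usage: candidate tails for a missing fact (head, relation, ?).
triple_view = ["Paris", "Lyon", "Berlin"]
path_view = ["Paris", "Berlin", "Marseille"]
print(fuse_rankings(triple_view, path_view))  # "Paris", agreed on by both views, ranks first
```

A rank-based heuristic like this needs no score calibration between the two views, which is one reason such fusion rules are common when combining heterogeneous predictors.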
Paper Type: Long
Research Area: Information Extraction
Research Area Keywords: Knowledge Graph, Knowledge Graph Completion, Link Prediction
Contribution Types: Model analysis & interpretability
Languages Studied: English
Submission Number: 5934