Abstract: Knowledge graph completion aims to make a knowledge graph more complete. Unfortunately, most existing completion methods assume that the entities and relations in the knowledge graph have sufficient triple instances. However, long-tail triples are abundant in general domains, and obtaining large amounts of high-quality annotated data in vertical domains is challenging. To address these issues, we propose a knowledge collaborative fine-tuning approach for low-resource knowledge graph completion. We leverage structured knowledge to construct an initial prompt template, and we learn the optimal templates, labels, and model parameters through a collaborative fine-tuning algorithm. Our method exploits both the explicit structured knowledge in the knowledge graph and the implicit triple knowledge in the language model, and it can be applied to link prediction and relation extraction. Experimental results show that our approach achieves state-of-the-art performance on three knowledge graph reasoning datasets and five relation extraction datasets.
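To make the prompt-construction idea concrete, the following is a minimal illustrative sketch (not the paper's actual implementation) of how a structured triple might be verbalized into a cloze-style template whose masked slot a language model then fills for link prediction; the function name, template format, and example entities are assumptions for illustration only.

```python
# Hypothetical sketch: turn a KG triple into a cloze-style prompt so that
# a masked language model can score candidate entities for link prediction.
# The template format and names here are illustrative assumptions.

def build_prompt(head: str, relation: str, tail: str = "[MASK]") -> str:
    """Construct an initial prompt template from a structured triple.

    The relation label is used as the textual connective between the two
    entity slots; the tail slot is masked so the language model's implicit
    triple knowledge can be queried to fill it.
    """
    relation_text = relation.replace("_", " ")  # e.g. "capital_of" -> "capital of"
    return f"{head} {relation_text} {tail} ."

# Link prediction: mask the tail entity and let the LM rank candidates.
prompt = build_prompt("Paris", "capital_of")
print(prompt)  # Paris capital of [MASK] .

# Relation extraction: verbalize both entities and mask the relation slot.
re_prompt = f"Paris [MASK] France ."
```

In a collaborative fine-tuning setup such as the one the abstract describes, templates like these would serve only as the initialization; the template tokens, label words, and model parameters would then be optimized jointly rather than kept fixed.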