KG-TRICK: Unifying Textual and Relational Information Completion of Knowledge for Multilingual Knowledge Graphs

ACL ARR 2024 June Submission4671 Authors

16 Jun 2024 (modified: 02 Jul 2024) · ACL ARR 2024 June Submission · CC BY 4.0
Abstract: Multilingual knowledge graphs (KGs) provide high-quality relational and textual information for various NLP applications, but they are often incomplete, especially in non-English languages. Previous research has shown that combining information from several knowledge graphs in different languages aids both Knowledge Graph Completion (KGC), the task of predicting missing relations between entities, and Knowledge Graph Enhancement (KGE), the task of predicting missing textual information for entities. While previous efforts have considered KGC and KGE as independent tasks, we hypothesize that they are interdependent and mutually beneficial. To this end, we introduce KG-TRICK, a novel sequence-to-sequence framework that unifies the tasks of textual and relational information completion for multilingual knowledge graphs. KG-TRICK demonstrates that i) it is possible to unify the tasks of KGC and KGE into a single framework, and ii) combining textual information from multiple languages is beneficial for improving the completeness of a KG. As part of our contributions, we also introduce WikiKGE++, the largest manually curated benchmark for textual information completion of KGs, which features over 30,000 instances across 10 diverse languages.
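The core idea of the abstract — casting both KGC (predicting missing relations) and KGE (predicting missing textual information) as text-to-text problems so one seq2seq model can handle both — can be illustrated with a minimal sketch. The function names and prompt formats below are purely hypothetical and are not the paper's actual input encoding:

```python
# Hypothetical sketch of unifying KGC and KGE as text-to-text generation.
# Prompt formats and task tags ([KGC]/[KGE]) are illustrative assumptions,
# not the encoding used by KG-TRICK itself.

def linearize_kgc(head: str, relation: str, lang: str) -> str:
    """KGC: predict the missing tail entity of a (head, relation, ?) triple."""
    return f"[KGC] [{lang}] {head} | {relation} | ?"

def linearize_kge(entity: str, field: str, lang: str) -> str:
    """KGE: predict missing textual information (e.g. a description) for an entity."""
    return f"[KGE] [{lang}] {entity} | {field} | ?"

# Because both tasks share one input/output format, a single seq2seq model
# can be trained on a mixture of examples across tasks and languages.
training_pairs = [
    (linearize_kgc("Dante Alighieri", "place of birth", "en"), "Florence"),
    (linearize_kge("Dante Alighieri", "description", "it"), "poeta italiano"),
]

for source, target in training_pairs:
    print(f"{source} -> {target}")
```

The unified format is what makes the hypothesized interdependence exploitable: textual and relational examples from multiple languages can be mixed in one training set.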
Paper Type: Long
Research Area: Multilingualism and Cross-Lingual NLP
Research Area Keywords: multilingual benchmarks, cross-lingual transfer, multilingualism
Contribution Types: NLP engineering experiment, Publicly available software and/or pre-trained models, Data resources
Languages Studied: Arabic, German, English, Spanish, French, Italian, Japanese, Korean, Chinese
Submission Number: 4671