Alleviating the Sparsity of Open Knowledge Graphs with Pretrained Contrastive Learning

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission
Abstract: Due to the sparsity of formal knowledge and the coarseness of non-ontological construction methods, relevant facts are often missing from Open Knowledge Graphs (OpenKGs). Although existing completion methods have achieved promising performance, they do not alleviate the sparsity problem of OpenKGs. Because sparse links provide few training opportunities, many few-shot and zero-shot entities cannot learn expressive high-dimensional representations. In this paper, we propose a new OpenKG Contrastive Learning (OKGCL) model that alleviates sparsity with contrastive entities and relations. OKGCL designs (a) negative entities to discriminate between different entities sharing the same relation, (b) negative relations to discriminate between different relations sharing the same entity pair, and (c) \emph{self} positive samples that give zero-shot and few-shot entities the opportunity to learn discriminative representations. Extensive experiments on benchmark datasets show the superiority of OKGCL over state-of-the-art models.
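The abstract does not spell out the training objective, but the three contrastive components can be illustrated with a standard InfoNCE-style loss. The sketch below is an assumption, not the paper's actual method: the function name `info_nce`, the temperature value, and the dropout-perturbed "self" positive for sparse entities are all hypothetical choices used only to make the idea concrete.

```python
import torch
import torch.nn.functional as F

def info_nce(anchor, positive, negatives, temperature=0.1):
    """Generic InfoNCE-style contrastive loss (illustrative, not OKGCL's exact loss).

    anchor:    (d,)   embedding of the query entity or relation
    positive:  (d,)   embedding of its positive sample
    negatives: (k, d) embeddings of k negative samples
    """
    anchor = F.normalize(anchor, dim=-1)
    # Positive goes first; negatives follow.
    candidates = F.normalize(torch.cat([positive.unsqueeze(0), negatives]), dim=-1)
    logits = candidates @ anchor / temperature       # (k+1,) similarity scores
    labels = torch.zeros(1, dtype=torch.long)        # the positive sits at index 0
    return F.cross_entropy(logits.unsqueeze(0), labels)

# Toy usage. Negative entities would be other entities observed with the same
# relation; negative relations, other relations observed with the same entity
# pair. For a zero-shot or few-shot entity, a "self" positive can be a second,
# lightly perturbed view of the entity's own embedding (a hypothetical choice).
d, k = 64, 8
entity = torch.randn(d)
self_positive = entity + 0.01 * torch.randn(d)   # perturbed view as self positive
negative_entities = torch.randn(k, d)
loss = info_nce(entity, self_positive, negative_entities)
```

The key property this sketch captures is that even an entity with no (or few) linked triples still receives a gradient signal through its self positive, which is how the abstract motivates giving zero-shot and few-shot entities a chance to learn discriminative representations.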