Disentangled Relational Graph Neural Network with Contrastive Learning for Knowledge Graph Completion

Published: 01 Jan 2024, Last Modified: 13 Nov 2024 · Knowl. Based Syst. 2024 · CC BY-SA 4.0
Abstract: Learning disentangled entity representations has garnered significant attention in the field of knowledge graph completion (KGC). However, existing methods overlook the indicative role of relations and the correlation between latent factors and relations, leading to suboptimal entity representations for KGC tasks. In this study, we introduce the Disentangled Relational Graph Neural Network with Contrastive Learning (DRGCL) method, designed to acquire disentangled entity representations guided by relations. In particular, we first devise a factor-aware relational message aggregation approach that learns entity representations within each semantic subspace and obtains latent factor representations via attention mechanisms. Subsequently, we propose a discrimination objective over factor-subspace pairs using a contrastive learning approach, which compels the factor representations to distinctly capture the information associated with different latent factors and promotes consistency between factor representations and semantic subspaces. Through disentanglement, our model can generate relation-aware scores tailored to the given scenario. Extensive experiments on three benchmark datasets demonstrate the superiority of our method over strong baseline models.
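The abstract describes two core ingredients: relation-guided attention over per-factor entity representations, and a contrastive objective that ties each factor representation to its own semantic subspace. The following is a minimal NumPy sketch of those two ideas only, under assumed shapes and names (`Z`, `P`, the temperature `0.1`, and the prototype-based InfoNCE form are all illustrative choices, not the paper's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

K, d = 4, 8          # assumed: number of latent factors, subspace dimension
n_entities = 5

# Hypothetical factor-specific entity representations: one d-dim vector per factor,
# as would be produced by factor-aware relational message aggregation.
Z = rng.normal(size=(n_entities, K, d))

# A relation embedding; relations indicate which latent factor is active.
r = rng.normal(size=(d,))

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Relation-guided attention over factors: one weight per factor per entity.
att = softmax(Z @ r)                       # (n_entities, K)
entity_repr = (att[..., None] * Z).sum(1)  # relation-aware entity vectors, (n_entities, d)

# Contrastive discrimination of factor-subspace pairs (InfoNCE-style sketch):
# each factor representation should align with its own subspace prototype P[k]
# and be pushed away from the other K-1 prototypes.
P = rng.normal(size=(K, d))                # hypothetical subspace prototypes

def l2norm(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

sim = l2norm(Z[0]) @ l2norm(P).T / 0.1     # (K, K) similarity logits, temperature 0.1
loss = -np.log(softmax(sim)[np.arange(K), np.arange(K)]).mean()
```

The attention step makes the final entity vector depend on the query relation, which is what allows the model to produce relation-aware scores; the contrastive term is what keeps the K factor channels from collapsing into redundant copies of one another.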
