ReadE: Learning Relation-Dependent Entity Representation for Knowledge Graph Completion

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission · Readers: Everyone
Abstract: Conventional knowledge graph embedding methods learn semantic representations for entities by modeling their intrinsic interactions with powerful graph neural networks. However, these methods assign each node a single coarse-grained representation, ignoring that different relations emphasize different aspects of an entity's semantics. To tackle this problem, we propose ReadE, a method that learns relation-dependent entity representations whose semantic content is emphasized by the type of the involved relation. First, we propose a relation-controlled gating mechanism that uses the relation to control the information flow in the aggregation step of the graph neural network. Second, we propose a contrastive learning method that mixes relation-level and entity-level negative samples to enhance the semantics preserved in relation-dependent entity representations. Experiments on three benchmarks show that our proposed model outperforms all strong baselines. The code will be open-sourced on GitHub.
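The abstract only sketches the relation-controlled gating idea, so the snippet below is a minimal illustration of what such an aggregation step could look like, not the authors' implementation. It assumes a PyTorch setup, and every module and variable name (RelationGatedAggregation, msg, gate, h_self, h_neigh, h_rel) is a hypothetical placeholder: a sigmoid gate conditioned on the target entity and the connecting relation scales each neighbor message before aggregation, so the relation controls the information flow into the entity representation.

```python
# Minimal sketch (hypothetical, not the ReadE code) of relation-controlled
# gating in a GNN aggregation step, assuming PyTorch.
import torch
import torch.nn as nn


class RelationGatedAggregation(nn.Module):
    """Aggregates neighbor messages with a gate computed from the relation."""

    def __init__(self, dim: int):
        super().__init__()
        self.msg = nn.Linear(2 * dim, dim)   # message from (neighbor, relation)
        self.gate = nn.Linear(2 * dim, dim)  # relation-dependent gate

    def forward(self, h_self, h_neigh, h_rel):
        # h_self:  (B, d)    target entity embedding
        # h_neigh: (B, K, d) neighbor entity embeddings
        # h_rel:   (B, K, d) embeddings of the connecting relations
        m = torch.tanh(self.msg(torch.cat([h_neigh, h_rel], dim=-1)))
        # Gate conditioned on the target entity and each relation type.
        g = torch.sigmoid(
            self.gate(
                torch.cat([h_self.unsqueeze(1).expand_as(h_neigh), h_rel], dim=-1)
            )
        )
        # The relation controls how much of each neighbor message flows in.
        agg = (g * m).mean(dim=1)
        return h_self + agg


if __name__ == "__main__":
    layer = RelationGatedAggregation(dim=16)
    h_self = torch.randn(4, 16)
    h_neigh = torch.randn(4, 5, 16)
    h_rel = torch.randn(4, 5, 16)
    print(layer(h_self, h_neigh, h_rel).shape)  # torch.Size([4, 16])
```

Because the gate depends on the relation embedding, the same entity can aggregate different neighborhood information under different relations, which is the "relation-dependent entity representation" behavior the abstract describes; the contrastive objective with mixed relation-level and entity-level negatives is not sketched here.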