Abstract: Existing knowledge graph embedding methods that adopt powerful graph neural networks aggregate well-preserved neighborhood information into the entity representation. However, they represent each entity with a single relation-irrespective representation that contains the entire miscellaneous neighborhood, regardless of the different semantics that different relations emphasize when predicting missing entities. To tackle this problem, we propose ReadE, a method that learns relation-dependent entity representations, in which neighborhood information is selectively aggregated and emphasized according to the relation type. First, we propose a relation-controlled gating mechanism that uses the relation to control the information flow from neighbors in the aggregation step of the graph neural network. Second, we propose a contrastive learning method that mixes relation-level and entity-level negative samples to enhance the semantics preserved in our relation-dependent GNN-based representations. Experiments on three benchmarks show that our proposed model outperforms all strong baselines. The code will be open-sourced on GitHub.
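To make the relation-controlled gating idea concrete, here is a minimal NumPy sketch of how a relation embedding could gate neighbor messages during aggregation. All names, dimensions, and the mean-pooling choice are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical sizes (not taken from the paper).
d = 4            # embedding dimension
n_neighbors = 3  # number of neighbor messages for one entity

# Query relation embedding, neighbor messages, and gate parameters.
rel = rng.normal(size=d)
neighbors = rng.normal(size=(n_neighbors, d))
W_gate = rng.normal(size=(2 * d, d))

def relation_gated_aggregate(rel, neighbors, W_gate):
    """Aggregate neighbor messages, each scaled by a relation-controlled gate."""
    gated = []
    for msg in neighbors:
        # The gate depends on both the query relation and the neighbor message,
        # so different relations pass through different neighborhood information.
        gate = sigmoid(np.concatenate([rel, msg]) @ W_gate)  # values in (0, 1)
        gated.append(gate * msg)
    # Pool the gated messages into one relation-dependent entity representation.
    return np.mean(gated, axis=0)

h = relation_gated_aggregate(rel, neighbors, W_gate)
print(h.shape)  # (4,)
```

Because the gate is conditioned on the relation, the same entity yields different aggregated representations for different query relations, which is the core distinction from a single relation-irrespective embedding.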
Paper Type: long