Abstract: In knowledge graph reasoning, existing graph attention mechanisms tend to concentrate attention on certain high-frequency relations. In this work, we design a target-relation-oriented attention reasoning model that focuses on the relations matching the target relation of the query. We propose a hierarchical (node-level and relational subgraph-level) attention mechanism that aggregates information from multi-hop neighbors and thereby yields a better node-embedding representation with high-order propagation characteristics; it also relieves over-smoothing to a certain extent. Node-level aggregation uses the classical graph-attention mechanism, while the attention distribution in subgraph-level aggregation is determined by the relation in the reasoning task, so that relation subgraphs receive different attention weights depending on the relation being reasoned about. Experiments show that our model significantly outperforms current state-of-the-art methods. We further study the influence of encoder parameters on model performance: increasing the number of attention heads, layers, or hidden output dimensions effectively improves performance.
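The sketch below is a minimal illustration (not the authors' code) of the two-level aggregation the abstract describes, assuming a PyTorch setting with learned entity and relation embeddings; the function names and the dot-product scoring of relation subgraphs against the target relation are illustrative assumptions.

```python
import torch
import torch.nn.functional as F


def node_level_gat(h_nodes, neighbor_idx, att_vec):
    """Classical graph-attention aggregation over one node's neighbors."""
    # h_nodes: (N, d) node embeddings; neighbor_idx: ids of the node's neighbors
    neighbors = h_nodes[neighbor_idx]                       # (k, d)
    scores = F.leaky_relu(neighbors @ att_vec)              # (k,) unnormalized scores
    alpha = torch.softmax(scores, dim=0)                    # node-level attention
    return (alpha.unsqueeze(-1) * neighbors).sum(dim=0)     # (d,) aggregated message


def relation_level_aggregate(subgraph_msgs, rel_emb, target_rel_emb):
    """Subgraph-level attention: weight each relation-specific message by how
    well its relation matches the target relation of the reasoning task."""
    # subgraph_msgs: (R, d) one aggregated message per relation subgraph
    # rel_emb: (R, d) relation embeddings; target_rel_emb: (d,)
    scores = rel_emb @ target_rel_emb                       # (R,) match with target
    beta = torch.softmax(scores, dim=0)                     # subgraph-level attention
    return (beta.unsqueeze(-1) * subgraph_msgs).sum(dim=0)  # (d,) updated embedding


# Toy usage: 5 nodes, 3 relations, 8-dimensional embeddings (all hypothetical).
d, N, R = 8, 5, 3
h = torch.randn(N, d)
rels = torch.randn(R, d)
target = rels[1]                                            # relation being queried
att = torch.randn(d)
# One node-level message per relation subgraph (neighbor sets chosen arbitrarily).
msgs = torch.stack([node_level_gat(h, [0, 2], att),
                    node_level_gat(h, [1, 3], att),
                    node_level_gat(h, [2, 4], att)])
h_updated = relation_level_aggregate(msgs, rels, target)    # new node representation
```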