Abstract: Knowledge graph embedding (KGE) techniques have emerged as a critical approach for addressing missing relations in knowledge graphs. However, existing methods often suffer from limitations, including high intra-group similarity, loss of semantic information, and insufficient inference capability, particularly for complex relation patterns such as 1-N and N-1 relations. To address these challenges, we introduce a novel KGE framework that leverages mutual information (MI) maximization to improve the semantic representation of entities and relations. By maximizing the mutual information between different components of a triple, such as (h, r) and t, or (r, t) and h, the proposed method strengthens the model's ability to preserve semantic dependencies while maintaining the relational structure of the knowledge graph. Extensive experiments on benchmark datasets demonstrate the effectiveness of our approach, with consistent performance improvements across various baseline models. Additionally, visualization analyses and case studies show the improved ability of the MI framework to capture complex relation patterns.