FHDM-KGE: Fuzzy Hierarchical Modeling and Dual Mixture-of-Experts for Knowledge Graph Embedding

ICLR 2026 Conference Submission 786 Authors

02 Sept 2025 (modified: 23 Dec 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Knowledge graph embedding, fuzzy hierarchical modeling, dual mixture of experts
Abstract: Real-world knowledge graphs (KGs) exhibit rich hierarchical structures, and modeling these structures effectively is crucial for learning high-quality representations and improving downstream reasoning. However, existing hierarchy-aware knowledge graph embedding (KGE) methods suffer from two key limitations: (i) hard layer assignment inevitably loses information for boundary or multi-role entities, and (ii) neglecting cross-layer relational differences restricts the expressiveness of relation embeddings. To overcome these issues, we propose FHDM-KGE, a Fuzzy Hierarchical Modeling and Dual Mixture-of-Experts framework for KGE. First, we introduce a differentiable SpringRank-based fuzzy hierarchy that assigns each entity to multiple layers with soft memberships, preserving multi-level semantics. We then design a dual MoE architecture: an entity-side MoE (EMoE) module, gated by the fuzzy memberships, captures intra-layer nuances, while a relation-side MoE (RMoE) module, guided by head–tail hierarchical differences, models cross-layer relational patterns. The resulting entity and relation embeddings are scored with a ConvE decoder. Experiments on multiple public benchmarks show that FHDM-KGE consistently outperforms strong baselines, validating the effectiveness of combining fuzzy hierarchical modeling with dual MoE specialization.
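To make the architecture described in the abstract concrete, the sketch below illustrates one plausible reading of the two gating mechanisms: soft layer memberships gating an entity-side expert mixture, and the head–tail hierarchy difference gating a relation-side mixture. All module names, dimensions, and gating formulas here are illustrative assumptions, not the paper's implementation; the differentiable SpringRank hierarchy and the ConvE decoder are replaced with simple stand-ins.

```python
# Minimal PyTorch sketch of a fuzzy-membership dual-MoE scorer.
# Assumptions: scalar hierarchy scores per entity (the paper derives these
# from a differentiable SpringRank objective), Gaussian-style soft layer
# memberships, and a trilinear stand-in for the ConvE decoder.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FuzzyDualMoE(nn.Module):
    def __init__(self, num_entities, num_relations, dim=200,
                 num_layers=4, num_experts=4):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        # Per-entity hierarchy score; stands in for a SpringRank solution.
        self.rank = nn.Embedding(num_entities, 1)
        # Learnable layer centers convert a scalar rank into memberships.
        self.layer_centers = nn.Parameter(torch.linspace(-1.0, 1.0, num_layers))
        # Entity-side experts (EMoE), gated by fuzzy layer memberships.
        self.e_experts = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_experts)])
        self.e_gate = nn.Linear(num_layers, num_experts)
        # Relation-side experts (RMoE), gated by head-tail rank difference.
        self.r_experts = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_experts)])
        self.r_gate = nn.Linear(1, num_experts)

    def memberships(self, e_idx):
        # Fuzzy layer assignment: softmax over negative squared distance
        # between the entity's rank and each layer center.
        r = self.rank(e_idx)                               # (B, 1)
        return F.softmax(-(r - self.layer_centers) ** 2, dim=-1)  # (B, L)

    def moe(self, x, experts, gate_logits):
        # Dense mixture: softmax-weighted sum of all expert outputs.
        w = F.softmax(gate_logits, dim=-1)                 # (B, E)
        outs = torch.stack([f(x) for f in experts], dim=1) # (B, E, D)
        return (w.unsqueeze(-1) * outs).sum(dim=1)         # (B, D)

    def forward(self, h_idx, r_idx, t_idx):
        # EMoE: refine the head embedding, gated by its memberships.
        mu_h = self.memberships(h_idx)
        h = self.moe(self.ent(h_idx), self.e_experts, self.e_gate(mu_h))
        # RMoE: refine the relation embedding, gated by the signed
        # head-tail hierarchy gap (the "cross-layer" signal).
        diff = self.rank(h_idx) - self.rank(t_idx)         # (B, 1)
        r = self.moe(self.rel(r_idx), self.r_experts, self.r_gate(diff))
        # Stand-in scorer; the paper scores (h, r, t) with ConvE instead.
        return (h * r * self.ent(t_idx)).sum(dim=-1)

model = FuzzyDualMoE(num_entities=100, num_relations=10)
scores = model(torch.tensor([0, 1]), torch.tensor([2, 3]), torch.tensor([4, 5]))
print(scores.shape)  # torch.Size([2])
```

Because both gates are differentiable functions of the rank scores, gradients from the link-prediction loss can flow back into the hierarchy itself, which is the point of using soft rather than hard layer assignments.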
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 786