Keywords: Knowledge graph embedding; Ricci flow; Geometric learning
TL;DR: We introduce RicciKGE, a framework that couples entity embeddings with an extended Ricci flow and proves curvature decay and convergence under heterogeneous knowledge graph geometries.
Abstract: Knowledge graph embedding (KGE) relies on the geometry of the embedding space to encode semantic and structural relations. Existing methods place all entities on a single homogeneous manifold (Euclidean, spherical, hyperbolic, or their product/multi-curvature variants) to model linear, symmetric, or hierarchical patterns. Yet a predefined, homogeneous manifold cannot accommodate the sharply varying curvature that real-world graphs exhibit across local regions. Because this geometry is imposed a priori, any mismatch with the knowledge graph's local curvatures distorts distances between entities and hurts the expressiveness of the resulting KGE. To rectify this, we propose RicciKGE, which couples the KGE loss gradient with local curvatures in an extended Ricci flow, so that entity embeddings and the underlying manifold geometry co-evolve dynamically toward mutual adaptation. Theoretically, when the coupling coefficient is bounded and properly selected, we rigorously prove that i) all edge-wise curvatures decay exponentially, driving the manifold toward Euclidean flatness; and ii) the KGE distances strictly converge to a global optimum, indicating that geometric flattening and embedding optimization reinforce each other. Empirical improvements on link prediction and node classification benchmarks demonstrate RicciKGE's effectiveness in adapting to heterogeneous knowledge graph structures.
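To make the coupling described above concrete, the sketch below shows what one joint update of embeddings and edge-wise curvatures could look like. It is a minimal illustration only, assuming a toy distance-based KGE loss, a scalar curvature estimate per edge, and an illustrative coupling coefficient `lam`; the function name `ricci_kge_step` and the specific update rules are hypothetical and are not the paper's actual algorithm.

```python
# Hypothetical sketch of one RicciKGE-style coupled update (not the authors' code).
# Entity embeddings follow a curvature-weighted KGE loss gradient, while edge-wise
# curvatures relax toward zero (Euclidean flatness), with a bounded coupling lam.
import torch

def ricci_kge_step(emb, edges, kappa, lam=0.1, lr=0.01):
    """One illustrative co-evolution step.

    emb   : (num_entities, dim) tensor with requires_grad=True
    edges : list of (head, tail) entity index pairs
    kappa : (num_edges,) tensor of edge-wise curvature estimates
    lam   : coupling coefficient (assumed bounded, per the paper's condition)
    lr    : step size shared by the gradient step and the discrete flow step
    """
    heads = torch.tensor([h for h, _ in edges])
    tails = torch.tensor([t for _, t in edges])

    # Curvature-weighted squared distances: a toy stand-in for a KGE loss.
    d = (emb[heads] - emb[tails]).norm(dim=1)
    loss = ((1.0 + lam * kappa) * d ** 2).mean()

    loss.backward()
    with torch.no_grad():
        # Embedding update: descend the curvature-coupled loss gradient.
        emb -= lr * emb.grad
        emb.grad.zero_()
        # Extended-Ricci-flow-style update: curvature decays toward flatness,
        # perturbed by the current edge distances (illustrative coupling term).
        kappa -= lr * (kappa + lam * d.detach())
    return loss.item()
```

Under this toy dynamics, the `kappa -= lr * kappa` component alone would give the exponential decay of edge-wise curvature that the abstract's result i) refers to; the `lam`-weighted terms model the two-way coupling between geometry and embedding optimization.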
Supplementary Material: zip
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 3076