AdaE: Knowledge Graph Embedding With Adaptive Embedding Sizes

Published: 2025. Last Modified: 21 Jan 2026. IEEE Trans. Knowl. Data Eng. 2025. License: CC BY-SA 4.0.
Abstract: Knowledge Graph Embedding (KGE) aims to learn dense embeddings as the representations of entities and relations in KGs. However, entities in existing KGs suffer from a data imbalance issue: occurrence frequencies vary substantially across entities. Existing KGE models pre-define a unified, fixed dimension size for all entity embeddings. Yet the appropriate embedding size of an entity depends heavily on its frequency, and a uniform embedding size can represent entities inadequately, causing overfitting for low-frequency entities and underfitting for high-frequency ones. A straightforward idea is to set the embedding size for each entity before KGE training. However, manually selecting different embedding sizes is labor-intensive and time-consuming, posing challenges in real-world applications. To tackle this problem, we propose AdaE, which adaptively learns KG embeddings with different embedding sizes during training. In particular, AdaE can select an appropriate dimension size for each entity from a continuous integer space. To this end, we tailor bilevel optimization to the KGE task, alternately learning entity representations and embedding sizes. The framework is general and flexible, and can be combined with various existing KGE models. Extensive experiments demonstrate the effectiveness and compatibility of AdaE.
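To illustrate the kind of alternating, bilevel-style update the abstract describes, the following is a minimal, hypothetical PyTorch sketch, not the authors' released code: it keeps a maximum-width embedding table, a per-entity soft mask over a few candidate sizes, and alternates between updating the embeddings on training triples and updating the size-selection logits on held-out triples. The candidate sizes, the `AdaESketch` class, and the TransE-style scorer are all illustrative assumptions.

```python
# Illustrative sketch only (assumed names and a TransE-style scorer),
# not the paper's actual implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

MAX_DIM = 64
CANDIDATE_SIZES = [16, 32, 48, 64]  # hypothetical candidate embedding sizes per entity

class AdaESketch(nn.Module):
    def __init__(self, n_entities, n_relations):
        super().__init__()
        self.ent = nn.Embedding(n_entities, MAX_DIM)
        self.rel = nn.Embedding(n_relations, MAX_DIM)
        # "architecture" parameters: per-entity logits over candidate sizes
        self.size_logits = nn.Parameter(torch.zeros(n_entities, len(CANDIDATE_SIZES)))
        # 0/1 templates that keep only the first k dimensions for each candidate size k
        masks = torch.zeros(len(CANDIDATE_SIZES), MAX_DIM)
        for i, k in enumerate(CANDIDATE_SIZES):
            masks[i, :k] = 1.0
        self.register_buffer("size_masks", masks)

    def masked_entity(self, idx):
        # Soft mixture of size masks; a hard size can be recovered later via argmax.
        probs = F.softmax(self.size_logits[idx], dim=-1)   # (B, |sizes|)
        mask = probs @ self.size_masks                      # (B, MAX_DIM)
        return self.ent(idx) * mask

    def score(self, h, r, t):
        # TransE-style score: -||h + r - t||_1 (higher is better).
        return -(self.masked_entity(h) + self.rel(r) - self.masked_entity(t)).norm(p=1, dim=-1)

def alternate_step(model, train_batch, valid_batch, w_opt, a_opt):
    """One alternating round: the inner step fits embeddings on training
    triples, the outer step adjusts size logits on held-out triples.
    A real KGE loss would also use negative sampling and a margin."""
    h, r, t = train_batch
    w_opt.zero_grad()
    (-model.score(h, r, t).mean()).backward()
    w_opt.step()

    h, r, t = valid_batch
    a_opt.zero_grad()
    (-model.score(h, r, t).mean()).backward()
    a_opt.step()

if __name__ == "__main__":
    model = AdaESketch(n_entities=100, n_relations=10)
    w_opt = torch.optim.Adam([model.ent.weight, model.rel.weight], lr=1e-2)
    a_opt = torch.optim.Adam([model.size_logits], lr=1e-2)
    fake = lambda: (torch.randint(0, 100, (8,)),
                    torch.randint(0, 10, (8,)),
                    torch.randint(0, 100, (8,)))
    for _ in range(3):
        alternate_step(model, fake(), fake(), w_opt, a_opt)
    print("selected sizes (argmax):",
          [CANDIDATE_SIZES[i] for i in model.size_logits.argmax(-1)[:5].tolist()])
```

Splitting the parameters across two optimizers is one simple way to realize the alternating scheme: the embedding weights and the size logits are never updated in the same step, mirroring the lower- and upper-level problems of a bilevel formulation.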