Convolutional Complex Knowledge Graph Embeddings

Published: 23 Feb 2021, Last Modified: 26 Mar 2024. Venue: ESWC 2021 Research.
Keywords: Knowledge graph embeddings, Convolutions, Complex numbers, Hermitian inner product
Abstract: We investigate the problem of learning continuous vector representations of knowledge graphs for predicting missing links. Recent results suggest that using a Hermitian inner product on complex-valued embeddings or convolutions on real-valued embeddings can be effective means for predicting missing links. We bring these insights together and propose ConEx---a multiplicative composition of a 2D convolution with a Hermitian inner product on complex-valued embeddings. ConEx utilizes the Hadamard product to compose a 2D convolution followed by an affine transformation with a Hermitian inner product in $\mathbb{C}$. This combination endows ConEx with the capability of (1) controlling the impact of the convolution on the Hermitian inner product of embeddings, and (2) degenerating into ComplEx if such a degeneration is necessary to further minimize the incurred training loss. We evaluated our approach on five of the most commonly used benchmark datasets. Our experimental results suggest that ConEx outperforms state-of-the-art models on four of the five datasets w.r.t. Hits@1 and MRR even without extensive hyperparameter optimization. Our results also indicate that the generalization performance of state-of-the-art models can be further increased by applying ensemble learning. We provide an open-source implementation of our approach, including training and evaluation scripts as well as pretrained models.
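The abstract describes ConEx's score as a Hadamard (element-wise) composition of a 2D-convolution output with the Hermitian inner product of complex embeddings. The following minimal NumPy sketch illustrates that composition; the single convolution kernel, the `tanh` nonlinearities, and the layer sizes are illustrative simplifications, not the paper's exact architecture (which uses standard trained convolution layers with multiple kernels).

```python
import numpy as np

def conv2d(x, kernel):
    """Minimal 'valid' 2D cross-correlation (stand-in for a conv layer)."""
    H, W = x.shape
    kh, kw = kernel.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out

def conex_score(e_h, e_r, e_t, kernel, W, b):
    """Sketch of a ConEx-style score: Re(<gamma ∘ e_h, e_r, conj(e_t)>),
    where gamma is produced by a convolution over the head/relation
    embeddings followed by an affine transformation."""
    d = e_h.shape[0]
    # Stack real and imaginary parts of head and relation into a 2D "image".
    x = np.stack([e_h.real, e_h.imag, e_r.real, e_r.imag])      # (4, d)
    feat = np.tanh(conv2d(x, kernel)).ravel()
    # Affine projection to 2d real values, reinterpreted as a complex vector.
    proj = np.tanh(feat @ W + b)                                 # (2d,)
    gamma = proj[:d] + 1j * proj[d:]
    # Hadamard composition with the Hermitian inner product in C^d.
    # If gamma were the all-ones vector, this would reduce to ComplEx.
    return np.real(np.sum(gamma * e_h * e_r * np.conj(e_t)))
```

If the convolution branch learns to output a (near-)constant vector, `gamma * e_h * e_r * conj(e_t)` collapses to the plain ComplEx trilinear product, which is the degeneration property claimed in the abstract.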
Subtrack: Machine Learning
First Author Is Student: Yes
Community Implementations: [3 code implementations](https://www.catalyzex.com/paper/arxiv:2008.03130/code)
