Knowledge Graph Completion as Tensor Decomposition: A General Form and Tensor N-rank Regularization

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: Knowledge Graph, Tensor Decomposition, Low-rank Tensor Completion
Abstract: Knowledge graph completion (KGC) is a third-order binary tensor completion task. Tensor decomposition based (TDB) models have shown strong performance in KGC. In this paper, we summarize existing TDB models and derive a general form for them. Based on this general form, we describe the design principles a model must follow to satisfy logical rules. However, these models suffer severely from overfitting. We therefore propose a regularization term based on the tensor $n$-rank, which enforces low-rankness of the completed tensor. First, we relax the tensor $n$-rank to the sum of the nuclear norms of the unfolding matrices along each mode of the tensor. To keep computation efficient, we further derive an upper bound on this sum. Finally, we use the upper bound as the regularization term, which yields a low-rank matrix decomposition of each unfolding matrix. Experiments show that our model achieves state-of-the-art performance on benchmark datasets.
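As a rough sketch of the relaxation described in the abstract (a generic illustration using standard facts about the nuclear norm; the precise bound used in the paper may differ): for a third-order tensor $\mathcal{X}$ with mode-$i$ unfolding $\mathbf{X}_{(i)}$, the tensor $n$-rank $\big(\operatorname{rank}(\mathbf{X}_{(1)}), \operatorname{rank}(\mathbf{X}_{(2)}), \operatorname{rank}(\mathbf{X}_{(3)})\big)$ is relaxed to the convex surrogate
$$\sum_{i=1}^{3} \big\|\mathbf{X}_{(i)}\big\|_*,$$
and the standard variational bound $\|\mathbf{A}\mathbf{B}^\top\|_* \le \tfrac{1}{2}\big(\|\mathbf{A}\|_F^2 + \|\mathbf{B}\|_F^2\big)$, valid for any factorization $\mathbf{X}_{(i)} = \mathbf{A}\mathbf{B}^\top$, then gives a computationally cheap upper bound that can serve as the regularization term.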
One-sentence Summary: We derive a general form for tensor decomposition based models and propose a regularization term based on the tensor n-rank.