Quantum and Translation Embedding for Knowledge Graph Completion

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Withdrawn Submission · Readers: Everyone
Keywords: quantum embedding, knowledge graph embedding, knowledge graph completion, logical rules mining, knowledge base
Abstract: Knowledge Graph Completion (KGC) aims to predict missing links for entity pairs in a Knowledge Graph (KG) based on known facts. In this work, we present a novel model to this end. In this model, quantum embedding and translation embedding serve as components that capture logical and structural features, respectively, within the same vector subspace. The two components are synergistic and achieve impressive performance at a low computational cost, close to that of the efficient TransE model. Surprisingly, on challenging datasets such as FB15k-237 and WN18RR, Hits@1 reaches up to 94.89% and 92.79%, respectively, even though the embedding dimension used during training is only 4. The insight of this work motivates dense-feature model design for KGC as a new alternative to deep neural networks (DNNs) on this task, or even a better choice.
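The abstract does not give the scoring function, so the sketch below is only a minimal, hypothetical reading of how a TransE-style translation term and a quantum-inspired rotation term could share one low-dimensional (dimension-4) subspace, as described above. The function names, the `alpha` mixing weight, and the phase-rotation form of the quantum component are assumptions for illustration and are not the authors' implementation.

```python
# Illustrative sketch only (not the paper's model): a translation term and a
# quantum-style rotation term scoring the same 4-dimensional embeddings.
import numpy as np

DIM = 4  # the abstract reports training with embedding dimension 4
rng = np.random.default_rng(0)

def init_entities(n_entities, dim=DIM):
    # real-valued entity embeddings shared by both components
    return rng.normal(scale=0.1, size=(n_entities, dim))

def init_relations(n_relations, dim=DIM):
    # each relation gets a translation vector and a set of rotation phases
    return {
        "translation": rng.normal(scale=0.1, size=(n_relations, dim)),
        "phase": rng.uniform(-np.pi, np.pi, size=(n_relations, dim)),
    }

def translation_score(h, r_trans, t):
    # TransE-style structural term: h + r should lie close to t
    return -np.linalg.norm(h + r_trans - t, ord=1)

def quantum_score(h, r_phase, t):
    # quantum-inspired logical term: rotate h by relation phases in the
    # complex plane and compare with t (an assumed stand-in for the
    # paper's quantum component)
    rotated = h.astype(complex) * np.exp(1j * r_phase)
    return -np.linalg.norm(rotated - t.astype(complex), ord=1)

def combined_score(h, rels, r_id, t, alpha=0.5):
    # alpha balances the two components; the actual combination rule is unknown
    return alpha * translation_score(h, rels["translation"][r_id], t) \
        + (1 - alpha) * quantum_score(h, rels["phase"][r_id], t)

# toy usage: score one (head, relation, tail) triple
ents = init_entities(n_entities=10)
rels = init_relations(n_relations=3)
print(combined_score(ents[0], rels, r_id=1, t=ents[2]))
```

The point of the sketch is only that both terms operate on the same low-dimensional entity vectors, which is how two components can complement each other without the cost rising much above that of TransE alone.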
One-sentence Summary: This work pushes the recent state of the art past 90% Hits@1 on all challenging datasets, including FB15k-237, WN18RR, and YAGO3-10.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Reviewed Version (pdf): https://openreview.net/references/pdf?id=40sAQuDo3
5 Replies