Abstract: Convolutional Neural Networks (CNNs) have proven effective for knowledge graph embedding, but existing CNN-based methods face two main challenges. First, CNN-based models with simple architectures are unable to extract latent features. Second, models that enhance feature extraction by adding extra modules inevitably increase training cost while yielding only limited performance gains. To address these challenges, we go beyond traditional CNNs and propose a novel knowledge graph embedding model that exploits the representational power of Quaternion Convolutional Neural Networks (QCNNs). Specifically, we learn entity and relation representations in quaternion space and apply a QCNN to extract the inherent information of the entity-relation matrix. We evaluate our model on multiple knowledge graph completion benchmark datasets. Experimental results show that it achieves clear improvements over existing CNN-based models. Moreover, our model trains faster than other strong models. The code for all experiments is available at https://github.com/llqy123/ConvQE.
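To make the core operation concrete, the sketch below illustrates a quaternion convolution layer as it is commonly formulated (the Hamilton product between a quaternion-valued input and quaternion-valued filters, computed with ordinary real convolutions). This is a minimal illustration of the general QCNN building block, not the paper's actual ConvQE implementation; the class name, shapes, and the reshaped entity-relation "image" in the usage example are assumptions for demonstration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class QuaternionConv2d(nn.Module):
    """Quaternion 2-D convolution via the Hamilton product.

    Each logical channel is quaternion-valued and stored as four real
    feature maps (r, i, j, k) concatenated along the channel axis.
    """

    def __init__(self, in_channels, out_channels, kernel_size, padding=0):
        super().__init__()
        # One real-valued kernel per quaternion component of the weights.
        shape = (out_channels, in_channels, kernel_size, kernel_size)
        self.w_r = nn.Parameter(torch.randn(shape) * 0.02)
        self.w_i = nn.Parameter(torch.randn(shape) * 0.02)
        self.w_j = nn.Parameter(torch.randn(shape) * 0.02)
        self.w_k = nn.Parameter(torch.randn(shape) * 0.02)
        self.padding = padding

    def forward(self, x):
        # Split the input into its four quaternion components.
        r, i, j, k = torch.chunk(x, 4, dim=1)
        conv = lambda t, w: F.conv2d(t, w, padding=self.padding)
        # Hamilton product x ⊗ W expressed with ordinary real convolutions.
        out_r = conv(r, self.w_r) - conv(i, self.w_i) - conv(j, self.w_j) - conv(k, self.w_k)
        out_i = conv(r, self.w_i) + conv(i, self.w_r) + conv(j, self.w_k) - conv(k, self.w_j)
        out_j = conv(r, self.w_j) - conv(i, self.w_k) + conv(j, self.w_r) + conv(k, self.w_i)
        out_k = conv(r, self.w_k) + conv(i, self.w_j) - conv(j, self.w_i) + conv(k, self.w_r)
        return torch.cat([out_r, out_i, out_j, out_k], dim=1)


if __name__ == "__main__":
    # Hypothetical usage: a batch of entity-relation matrices reshaped into
    # 2-D "images" with one quaternion channel (4 real feature maps).
    er_matrix = torch.randn(8, 4 * 1, 16, 16)
    qconv = QuaternionConv2d(in_channels=1, out_channels=8, kernel_size=3, padding=1)
    features = qconv(er_matrix)
    print(features.shape)  # torch.Size([8, 32, 16, 16]) -> 8 quaternion output channels
```

Because the four weight components are shared across the Hamilton product terms, a quaternion layer captures cross-component interactions with roughly a quarter of the parameters of a comparable real-valued convolution, which is the usual motivation for using QCNNs for compact representation learning.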