Abstract: Knowledge graph completion is a critical task in natural language processing. The task becomes more challenging on temporal knowledge graphs, where each fact is associated with a timestamp. Cognitive science has revealed that time-dependent historical experience can activate neurons, and that time-related and static information should be fused to represent observed facts. Meanwhile, CNN models correspond to the biological cortex in several respects; in particular, information at different cortical levels can be described using convolution kernels of different sizes. Most existing methods for temporal knowledge graph completion learn time-varying relation embeddings whose number scales with the number of entities or timestamps, and then score a quadruple by the dot product between the embeddings of entities and relations. However, the dot product cannot adequately capture the complex interactions between embeddings. Inspired by these findings, this paper proposes a multi-scale convolutional neural network (MsCNN), which utilizes both static and dynamic information to represent relation embeddings, and uses convolution to learn the mutual information between the embeddings of time-varying relations and entities. In addition, multi-scale convolution kernels are used to learn this mutual information at different levels. We also verify that performance improves as the embedding dimension increases. MsCNN achieves state-of-the-art link prediction results on three benchmark datasets. It effectively fuses static and temporal information and explores different levels of mutual information between the input embeddings.
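The scoring idea described above can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the function names, the way static and dynamic relation parts are fused by addition, and the random placeholder filters are all assumptions made for exposition.

```python
import numpy as np

def multiscale_conv_features(head, relation_t, kernel_sizes=(1, 2, 3)):
    """Hypothetical sketch: stack the head-entity embedding and a
    time-aware relation embedding into a 2 x d grid, slide convolution
    filters of several widths along the embedding axis, and concatenate
    the resulting feature maps. A trained model would learn the filter
    weights; here they are random placeholders."""
    x = np.stack([head, relation_t])                 # shape (2, d)
    d = x.shape[1]
    features = []
    for k in kernel_sizes:
        rng = np.random.default_rng(k)               # placeholder weights
        w = rng.standard_normal((2, k))              # filter covers both rows
        fmap = np.array([np.sum(x[:, i:i + k] * w)   # valid convolution
                         for i in range(d - k + 1)])
        features.append(np.maximum(fmap, 0.0))       # ReLU activation
    return np.concatenate(features)

d = 8
head = np.ones(d)
rel_static = 0.5 * np.ones(d)
rel_dynamic = 0.1 * np.ones(d)                       # timestamp-dependent part
rel_t = rel_static + rel_dynamic                     # assumed fusion by addition
feats = multiscale_conv_features(head, rel_t)
print(feats.shape)                                   # (8) + (7) + (6) = (21,)
```

Each kernel width captures interactions at a different granularity: width-1 filters mix the entity and relation dimension-wise, while wider filters span neighboring embedding dimensions, which is the multi-scale aspect the abstract refers to.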