Abstract: While knowledge graphs contain rich semantic knowledge about various entities and the relational information among them, temporal knowledge graphs (TKGs) additionally describe and model the interactions of these entities over time. In this context, automatic temporal knowledge graph completion (TKGC) has attracted great interest. Recent TKGC methods integrate advanced deep learning techniques, e.g., Transformers, to boost model performance. However, we find that rather than adopting complex modules of various kinds, it is more beneficial to capture more extensive temporal information. In this paper, we propose a simple but powerful graph encoder for TKGC, namely, TARGCN. TARGCN is parameter-efficient and extensively utilizes information from the whole temporal context. We perform experiments on three benchmark datasets. Our model achieves a relative improvement of more than 46% on the GDELT dataset over state-of-the-art models. Meanwhile, it outperforms the strongest baseline on the ICEWS05-15 dataset with around 18% fewer parameters.
Paper Type: long