Abstract: Answering natural language questions over knowledge bases (KBQA) has attracted wide attention. Several techniques have been developed for answering simple questions, mostly relying on deep networks that cast relation prediction as classification. Contrastive learning has recently been shown to improve classification performance, yet most prior techniques do not benefit from it. In light of this, we propose a novel approach to answering simple questions over knowledge bases. Our approach has two key features. (1) It leverages pre-trained transformers to achieve better entity linking. (2) It employs a contrastive-learning-based model for relation prediction. We experimentally verify the performance of our approach and show that it achieves an accuracy of 83.54% on a typical benchmark dataset, beating existing state-of-the-art techniques; we also conduct an in-depth analysis of its sub-modules to show the advantages of our technique.
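To make the second feature concrete, below is a minimal sketch of the kind of objective the abstract alludes to, assuming a supervised contrastive (InfoNCE-style) loss over question encodings grouped by their gold relation label. The function name, tensor shapes, and temperature value are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Illustrative supervised contrastive loss for relation prediction:
    pulls together question encodings that share a gold relation label
    and pushes apart those that do not.

    embeddings: (batch, dim) question representations from an encoder
    labels:     (batch,) gold relation ids
    """
    # Cosine similarity between all pairs in the batch, scaled by temperature.
    z = F.normalize(embeddings, dim=1)
    sim = z @ z.T / temperature                       # (batch, batch)

    # Exclude self-similarity with a large negative value (keeps the
    # subsequent log-softmax finite everywhere).
    batch = z.size(0)
    eye = torch.eye(batch, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(eye, -1e9)

    # Positive pairs: same relation label, excluding self.
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye

    # Row-wise log-softmax gives log p(j | anchor i) over all candidates j.
    log_prob = F.log_softmax(sim, dim=1)

    # Average log-probability of positives per anchor, skipping anchors
    # with no positive in the batch.
    pos_counts = pos_mask.sum(dim=1)
    has_pos = pos_counts > 0
    mean_log_prob_pos = (log_prob * pos_mask).sum(dim=1)[has_pos] / pos_counts[has_pos]
    return -mean_log_prob_pos.mean()
```

In a setup like this, the contrastive term would typically be combined with a standard cross-entropy classification loss, so that relation representations are both discriminative and well clustered.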