SPBERT: An Efficient Pre-training BERT on SPARQL Queries for Question Answering over Knowledge Graphs
Abstract: Knowledge graphs have become increasingly popular and necessary in recent years. To address the lack of structural information in the SPARQL query language, we propose SPBERT, a transformer-based language model pre-trained on massive SPARQL query logs. By incorporating the masked language modeling objective and the word structural objective, SPBERT learns general-purpose representations of both natural language and the SPARQL query language. We investigate how SPBERT and an encoder-decoder architecture can be adapted to knowledge-based QA corpora, and we conduct extensive experiments on two downstream tasks: SPARQL Query Construction and Answer Verbalization Generation. The experimental results show that SPBERT obtains promising results, achieving state-of-the-art BLEU scores on several of these tasks.
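For concreteness, below is a minimal sketch of what masked language modeling over a SPARQL string might look like. It is illustrative only, not the paper's implementation: the whitespace tokenization, the 15% masking rate (the standard BERT default), and the helper name mask_sparql_tokens are all assumptions.

```python
import random

MASK_TOKEN = "[MASK]"
MASK_PROB = 0.15  # assumed: standard BERT masking rate, not confirmed by the paper


def mask_sparql_tokens(query: str, rng: random.Random) -> tuple[list[str], list[str]]:
    """Whitespace-tokenize a SPARQL query and mask ~15% of tokens for MLM.

    Returns (masked_tokens, labels): labels keep the original token at masked
    positions and a placeholder elsewhere (ignored in the loss).
    """
    tokens = query.split()
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < MASK_PROB:
            masked.append(MASK_TOKEN)
            labels.append(tok)      # the model must recover the original token
        else:
            masked.append(tok)
            labels.append("[PAD]")  # position not scored during pre-training
    return masked, labels


rng = random.Random(0)
query = "SELECT ?person WHERE { ?person wdt:P106 wd:Q82955 . }"
masked, labels = mask_sparql_tokens(query, rng)
print(" ".join(masked))
```

Pre-training on such masked query logs is what lets the encoder pick up SPARQL-specific regularities (keywords, variables, entity/property URIs) before fine-tuning on the downstream QA tasks.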