Pretrained Transformers for Simple Question Answering

Anonymous

02 May 2019 (modified: 28 Jun 2019) · OpenReview Anonymous Preprint Blind Submission
Abstract: Answering simple questions over knowledge graphs is a well-studied problem in question answering. Previous approaches for this task built on architectures based on recurrent and convolutional neural networks (RNNs and CNNs) that use pretrained word embeddings. It was recently shown that a pretrained transformer network (BERT) can outperform RNN- and CNN-based approaches on various natural language processing tasks. In this work, we investigate how well BERT performs on the entity span prediction and relation prediction subtasks of simple QA. In addition, we provide an evaluation of both BERT- and BiLSTM-based models in data-sparse scenarios.
Keywords: question answering, BERT, transformer
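As a rough illustration of the two subtasks named in the abstract, the sketch below frames entity span prediction as token classification and relation prediction as sequence classification on top of BERT. It is a minimal sketch, not the authors' implementation: it assumes the Hugging Face `transformers` library (a recent version whose model outputs expose `.logits`), `bert-base-uncased` weights, and a placeholder relation vocabulary size.

```python
import torch
from transformers import (
    BertTokenizerFast,
    BertForTokenClassification,
    BertForSequenceClassification,
)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")

# Entity span prediction as token classification:
# label each wordpiece as inside (1) or outside (0) the subject-entity mention.
span_model = BertForTokenClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Relation prediction as sequence classification over the KG relation vocabulary.
NUM_RELATIONS = 1000  # placeholder; set to the dataset's relation vocabulary size
relation_model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=NUM_RELATIONS
)

question = "what city was the band formed in"  # example question, not from the paper
inputs = tokenizer(question, return_tensors="pt")

with torch.no_grad():
    # One 0/1 tag per token marks the predicted entity span.
    span_tags = span_model(**inputs).logits.argmax(dim=-1)
    # A single distribution over relations gives the predicted relation id.
    relation_id = relation_model(**inputs).logits.argmax(dim=-1)
```

Both heads would need fine-tuning on the question answering dataset before the predictions are meaningful; the sketch only shows how the two subtasks can be cast as standard BERT classification problems.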