Consecutive Question Generation with Multitask Joint Reranking and Dynamic Rationale Search

Anonymous

16 Jan 2022 (modified: 05 May 2023) · ACL ARR 2022 January Blind Submission
Abstract: Automatic question generation (QG) aims to generate a set of questions for a given passage, and can be viewed as a dual task of question answering (QA). However, most current QG methods generate questions independently, one at a time, mainly based on specific extracted answer spans. In this paper, we propose to consecutively generate questions over a whole passage, with comprehensive consideration of accuracy, diversity, informativeness, and coverage. First, we examine four key elements in QG, i.e., question, answer, rationale, and context history, and propose a novel multitask framework with one main task generating a question-answer pair and four auxiliary tasks generating the other elements alternately, improving model performance along all of these aspects through both joint training and reranking. Further, to learn the connections between questions and fully exploit the important information in every sentence, we propose a new consecutive generation strategy, which dynamically selects the rationales and searches for the best question series globally. Extensive experiments on different datasets show that our method improves question generation significantly and benefits multiple related NLP tasks.
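To make the abstract's strategy concrete, here is a minimal, hypothetical sketch (not the paper's actual code) of consecutive generation as a global search over per-question candidates. Each candidate is scored by a main QA-pair score plus auxiliary-task scores, standing in for the paper's multitask joint reranking, and reusing a rationale sentence is penalized, standing in for dynamic rationale selection that pushes the series toward covering the whole passage. All names, weights, and scores below are illustrative assumptions.

```python
# Hypothetical sketch of consecutive question generation with joint reranking
# and a global (beam) search over the question series. Scores and weights are
# invented for illustration; the real model would produce them.

def joint_score(candidate):
    """Combine the main QA-pair score with auxiliary-task scores.

    The 0.5 weight on auxiliary scores is an illustrative choice.
    """
    return candidate["main"] + 0.5 * sum(candidate["aux"])

def search_question_series(steps, beam_width=2, reuse_penalty=1.0):
    """Beam-search the best sequence of candidates, one per question slot.

    `steps` is a list of candidate lists; each candidate records which
    rationale sentence it is grounded in, and repeated rationales are
    penalized so the series covers more of the passage.
    """
    # Each beam entry: (chosen candidates, running score, used rationales).
    beam = [([], 0.0, frozenset())]
    for candidates in steps:
        expanded = []
        for chosen, score, used in beam:
            for cand in candidates:
                penalty = reuse_penalty if cand["rationale"] in used else 0.0
                expanded.append((
                    chosen + [cand],
                    score + joint_score(cand) - penalty,
                    used | {cand["rationale"]},
                ))
        # Keep only the globally best partial series so far.
        beam = sorted(expanded, key=lambda b: b[1], reverse=True)[:beam_width]
    return beam[0]

# Toy example: two question slots, each with two candidates grounded
# in passage sentences "s1" and "s2".
steps = [
    [{"q": "Q1a", "rationale": "s1", "main": 2.0, "aux": [1.0, 0.5]},
     {"q": "Q1b", "rationale": "s2", "main": 1.8, "aux": [0.9, 0.6]}],
    [{"q": "Q2a", "rationale": "s1", "main": 2.1, "aux": [1.1, 0.4]},
     {"q": "Q2b", "rationale": "s2", "main": 2.0, "aux": [1.0, 0.5]}],
]
best, best_score, _ = search_question_series(steps)
# The global search picks Q1a then Q2b: although Q2a scores higher in
# isolation, it reuses rationale "s1" and is penalized.
```

Note the design point this sketch illustrates: a greedy, question-by-question generator scores each question locally, whereas searching the whole series lets a coverage penalty trade a locally stronger candidate for a better overall set.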
Paper Type: long