Abstract: Multiple-choice question generation is an important application of natural language processing and is widely used in education. Distractors play an essential role in multiple-choice questions. Introducing external knowledge is an effective way to improve distractor quality, and knowledge graphs are a widely used source of such knowledge. Knowledge-based distractor generation faces two main problems: 1) a lack of sufficient distractors in the subgraph extracted from the knowledge graph, and 2) the incompleteness and sparsity of the knowledge graph. In this paper, we introduce two optimization strategies to overcome these limitations, namely retention of easily confused distractors (RECD) and knowledge graph path generation (KGPG), and propose a heterogeneous two-tower model (KG2MCQ) to match question-answer pairs with distractors. We validate the effectiveness of our model and strategies by generating commonsense multiple-choice questions on the CommonsenseQA dataset, and the experimental results demonstrate our method's promising potential.