Distractor Generation for Multiple-Choice Questions: A Survey of Methods, Datasets, and Evaluation

Anonymous

16 Feb 2024 · ACL ARR 2024 February Blind Submission · Readers: Everyone
Abstract: Distractors, as part of multiple-choice questions (MCQs), are vital in learning evaluation and are commonly used in education across a variety of domains such as Science, English, and Mathematics. Advances in artificial intelligence (AI) have enabled the Distractor Generation (DG) problem to progress from traditional methods to advanced neural networks and pre-trained models. This survey reviews DG tasks on English MCQ datasets in both textual and multi-modal contexts. In particular, it presents a thorough literature review of recent methods for DG tasks, discusses multiple-choice components and their characteristics, analyzes the related datasets, summarizes the evaluation metrics, reports current findings from existing benchmarks and methods, and highlights the challenges and open issues.
Paper Type: long
Research Area: Generation
Contribution Types: Surveys
Languages Studied: CLOTH (CLOTH-M, CLOTH-H), SCDE, DGen, CELA, SciQ, AQUA-RAT, OpenBookQA, ARC (ARC-Challenge, ARC-Easy), MCQL, CommonSenseQA, MathQA, QASC, MedMCQA, Televic, EduQG, Children's Book Test, Who Did What, MCTest (MCTest-160, MCTest-500), RACE (RACE-M, RACE-H), RACE-C, DREAM, CosmosQA, ReClor, QuAIL, MovieQA, Visual7W, TQA, RecipeQA, ScienceQA.