A Simple and Effective Model for Multi-Hop Question Generation

Anonymous

17 Sept 2021 (modified: 05 May 2023) · ACL ARR 2021 September Blind Submission
Abstract: Previous research on automated question generation has almost exclusively focused on generating factoid questions whose answers can be extracted from a single document. However, there is increasing interest in developing systems capable of more complex multi-hop question generation (QG), where answering the question requires reasoning over multiple documents. In this work, we propose a simple and effective approach based on the transformer model for multi-hop QG. Our approach consists of specialized input representations, a supporting sentence classification objective, and training data weighting. Prior work on multi-hop QG considers the simplified setting of shorter documents and also advocates the use of entity-based graph structures as essential ingredients in model design. In contrast, we show that our model scales to the challenging setting of longer documents as input, does not rely on graph structures, and substantially outperforms state-of-the-art approaches as measured by automated metrics and human evaluation.
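The abstract does not spell out how these components are combined, so the following is only a minimal sketch, under the assumption that the supporting sentence objective is an auxiliary classification loss added to the question-generation loss, and that training data weighting is applied as per-example weights on the generation loss. All names here (MultiHopQGHead, example_weights, cls_weight) are hypothetical and not taken from the paper.

```python
# Hypothetical sketch (not the authors' released code): combining a
# seq2seq question-generation loss with an auxiliary supporting-sentence
# classification loss and per-example training-data weights.
import torch
import torch.nn as nn


class MultiHopQGHead(nn.Module):
    """Toy loss head on top of an encoder-decoder transformer's outputs."""

    def __init__(self, hidden_size: int, vocab_size: int):
        super().__init__()
        # Binary classifier over sentence-level encoder representations:
        # does this sentence support the target question?
        self.support_classifier = nn.Linear(hidden_size, 2)
        self.gen_loss = nn.CrossEntropyLoss(reduction="none")
        self.cls_loss = nn.CrossEntropyLoss()
        self.vocab_size = vocab_size

    def forward(self, decoder_logits, question_ids,
                sentence_reprs, support_labels, example_weights,
                cls_weight: float = 0.5):
        # Token-level generation loss, averaged per example so that
        # example_weights (training data weighting) can be applied.
        B, T, V = decoder_logits.shape
        tok_loss = self.gen_loss(
            decoder_logits.reshape(B * T, V), question_ids.reshape(B * T)
        ).reshape(B, T).mean(dim=1)
        gen = (tok_loss * example_weights).mean()

        # Auxiliary objective: classify each candidate sentence as
        # supporting / non-supporting.
        S = sentence_reprs.size(1)
        cls_logits = self.support_classifier(sentence_reprs)
        cls = self.cls_loss(cls_logits.reshape(B * S, 2),
                            support_labels.reshape(B * S))

        # Weighted sum of the two objectives.
        return gen + cls_weight * cls
```

In this reading, the generation loss and the classification loss share the same encoder, which is the usual way an auxiliary supervision signal is added to a transformer QG model; the actual formulation in the paper may differ.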