VOLTA: Diverse and Controllable Question Generation with Variational-Mutual-Information-Maximizing VAE

Anonymous

16 Oct 2022 (modified: 05 May 2023) · ACL ARR 2022 October Blind Submission · Readers: Everyone
Keywords: natural language generation, question generation, diverse generation, controllable generation
Abstract: Most recent natural language generation models focus only on the quality of the generated text, which is usually measured against a set of reference sentences. As a result, such models tend to generate similar sentences given the same context, leading to low diversity in the generated content. In this paper, we propose VOLTA, a model that leverages the Variational Autoencoder framework to improve the diversity of large-scale language models. Unlike prior attempts, we use a shared GPT-2 backbone network for both the encoder and the decoder, since GPT-2 has proved effective in both natural language understanding and generation. In addition, we add latent codes, originally introduced in InfoGAN, to enable input-independent controllability. Our architecture can be applied to any typical language generation task, but we evaluate it on question-answer pair generation because that task has a series of well-established evaluation metrics. Experimental results show that our model significantly improves generative diversity over previous models.
Paper Type: long
Research Area: Generation
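
The abstract describes the architecture only at a high level: a VAE whose encoder and decoder share a single GPT-2 backbone, with an InfoGAN-style discrete/continuous code concatenated to the Gaussian latent variable for input-independent control. The following is a minimal, hypothetical PyTorch sketch of that idea, not the paper's actual implementation: the class and variable names, the mean-pooling encoder head, and the prefix-embedding injection of the latent into the decoder are all assumptions, and the InfoGAN mutual-information term (an auxiliary network that recovers the code from generated text) is omitted for brevity.

```python
import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel

class VoltaSketch(nn.Module):
    """Hypothetical sketch of a VAE with a shared GPT-2 backbone plus
    InfoGAN-style latent codes. Names and design choices are illustrative
    assumptions, not taken from the paper."""

    def __init__(self, latent_dim=32, code_dim=8, model_name="gpt2"):
        super().__init__()
        # One GPT-2 serves as both encoder and decoder (shared backbone).
        self.backbone = GPT2LMHeadModel.from_pretrained(model_name)
        hidden = self.backbone.config.n_embd
        # Encoder head: map pooled hidden states to Gaussian parameters.
        self.to_mu = nn.Linear(hidden, latent_dim)
        self.to_logvar = nn.Linear(hidden, latent_dim)
        # Project [latent variable; controllable code] into the decoder's
        # embedding space, injected as a single prefix embedding.
        self.latent_to_prefix = nn.Linear(latent_dim + code_dim, hidden)

    def encode(self, input_ids, attention_mask):
        out = self.backbone.transformer(input_ids, attention_mask=attention_mask)
        pooled = out.last_hidden_state.mean(dim=1)  # simple mean pooling (assumption)
        return self.to_mu(pooled), self.to_logvar(pooled)

    @staticmethod
    def reparameterize(mu, logvar):
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def forward(self, input_ids, attention_mask, code, labels):
        mu, logvar = self.encode(input_ids, attention_mask)
        z = self.reparameterize(mu, logvar)
        prefix = self.latent_to_prefix(torch.cat([z, code], dim=-1)).unsqueeze(1)
        tok_emb = self.backbone.transformer.wte(input_ids)
        inputs_embeds = torch.cat([prefix, tok_emb], dim=1)
        # Pad labels with -100 at the prefix position so it is ignored
        # by the language-modeling loss.
        pad = torch.full_like(labels[:, :1], -100)
        out = self.backbone(inputs_embeds=inputs_embeds,
                            labels=torch.cat([pad, labels], dim=1))
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return out.loss + kl  # ELBO-style objective: reconstruction + KL
```

At generation time, diversity would come from sampling z from the prior and controllability from choosing the code vector, while the decoder conditions on both through the prefix embedding. An InfoGAN-style implementation would additionally maximize the mutual information between the code and the generated question-answer pair via the omitted auxiliary network.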