On the Anatomy of Latent-variable Generative Models for Conditional Text Generation

Anonymous

16 Jan 2022 (modified: 05 May 2023) · ACL ARR 2022 January Blind Submission · Readers: Everyone
Abstract: Conditional text generation is a non-trivial task that has so far been addressed predominantly with latent-variable generative models. In this work, we explore several design choices that affect two essential aspects of model performance: expressivity and controllability. We experiment with a series of latent-variable models built around simple design changes within a general unified framework, with a particular focus on prior distributions based on Energy-Based Models (EBMs) rather than the usual standard Gaussian. Our experiments validate the claim that this richer prior yields better representational power, but at the cost of a more difficult training procedure. We provide a comprehensive analysis of these difficulties and a close comparison with recent work on EBM-based priors for conditional text generation.
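For readers unfamiliar with EBM-based priors, the sketch below (in PyTorch) illustrates the general construction the abstract refers to: a standard Gaussian base distribution tilted by a learned energy network, sampled with short-run Langevin dynamics. The class name, network shape, and step sizes (`EnergyPrior`, `steps`, `step_size`) are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class EnergyPrior(nn.Module):
    """Illustrative EBM prior: p(z) proportional to exp(-E_phi(z)) * N(z; 0, I).

    The energy network "tilts" the standard Gaussian base so the prior
    can place mass on richer, multi-modal regions of the latent space.
    Hypothetical sketch, not the paper's exact formulation.
    """

    def __init__(self, latent_dim: int = 32, hidden: int = 128):
        super().__init__()
        self.latent_dim = latent_dim
        self.energy = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.GELU(),
            nn.Linear(hidden, hidden), nn.GELU(),
            nn.Linear(hidden, 1),
        )

    def neg_log_prob(self, z: torch.Tensor) -> torch.Tensor:
        # Unnormalized negative log-density: E_phi(z) + ||z||^2 / 2.
        return self.energy(z).squeeze(-1) + 0.5 * z.pow(2).sum(dim=-1)

    def sample(self, n: int, steps: int = 60, step_size: float = 0.1) -> torch.Tensor:
        # Short-run Langevin dynamics initialized from the Gaussian base.
        # This inner MCMC loop is typically what makes training EBM priors
        # harder than sampling a fixed Gaussian, as the abstract alludes to.
        z = torch.randn(n, self.latent_dim, requires_grad=True)
        for _ in range(steps):
            grad, = torch.autograd.grad(self.neg_log_prob(z).sum(), z)
            z = z - 0.5 * step_size ** 2 * grad + step_size * torch.randn_like(z)
            z = z.detach().requires_grad_(True)
        return z.detach()

# Usage: draw latent codes to feed a conditional decoder.
prior = EnergyPrior()
z = prior.sample(n=8)
```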
Paper Type: long