Mix and Match: Learning-free Controllable Text Generation using Energy Language Models

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission
Abstract: Due to the unidirectional nature of prevalent autoregressive generation models, recent work on controlled generation based on global text attributes has either required attribute-based fine-tuning of the base language model, or restricted the parametrization of the attribute prediction model to be compatible with the base LM. In this work, we propose Mix and Match LM, a global score-based alternative for controllable text generation that combines arbitrary pretrained black-box models for the desired attributes, without any fine-tuning or structural assumptions about the black-box models. We interpret the task of controllable generation as drawing samples from an energy-based model whose energy values are a linear combination of scores from black-box models that are separately responsible for fluency, the control attribute, and faithfulness to any conditioning context. We use a Metropolis-Hastings sampling scheme to sample from this energy-based model using bidirectional context and global attribute features. We validate the effectiveness of our approach on various controlled generation and style transfer tasks, outperforming recently proposed methods that involve extra training, fine-tuning, or restrictive assumptions about the form of the models.