A Step-Wise Weighting Approach for Controllable Text Generation

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: language models, controllable text generation, text detoxification
Abstract: We study the problem of controllable text generation (CTG): steering a language model (LM) to generate text with a desired attribute. Many existing approaches either require extensive training or fine-tuning of the LM for each attribute under control or are slow to generate text. To address this, we first propose a framework based on step-wise energy-based models (EBMs) that is efficient in sampling and flexible across a wide range of practical CTG scenarios. Indeed, a number of existing CTG methods are special instances of our framework with a specific EBM design. For different control scenarios, we then design the respective energy functions that strategically up- or down-weight the probabilities of keywords associated with a given control attribute at each generation step. In experiments, we show that our simple and efficient approach is surprisingly competitive against more computationally expensive strong baselines, and even achieves new state-of-the-art performance in several cases. Our framework also provides a tuning hyper-parameter that trades off generation quality against control satisfaction, enabling practitioners to easily adjust it to meet their needs.
One-sentence Summary: We propose an efficient, effective, and flexible controllable text generation method by strategically reweighting keywords associated with the control attribute at each time step during generation.
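The step-wise reweighting idea in the abstract can be sketched in a few lines: at each decoding step, shift the logits of tokens tied to the control attribute before sampling. The toy vocabulary, the function name `reweight_logits`, and the additive bias `lam` below are illustrative assumptions, not the paper's actual energy-function design.

```python
import numpy as np

def reweight_logits(logits, boost_ids, suppress_ids, lam=2.0):
    """Illustrative step-wise reweighting (an assumption, not the paper's
    exact energy function): add +lam to logits of tokens associated with
    the desired attribute and -lam to tokens of the undesired attribute,
    i.e. multiply their probabilities by exp(+/-lam) before renormalizing."""
    out = logits.copy()
    out[list(boost_ids)] += lam
    out[list(suppress_ids)] -= lam
    return out

def softmax(x):
    z = x - x.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Toy 5-token vocabulary; token 1 is an undesired keyword, token 3 a desired one.
logits = np.array([1.0, 2.0, 0.5, 1.5, 0.0])
p_before = softmax(logits)
p_after = softmax(reweight_logits(logits, boost_ids={3}, suppress_ids={1}))

# The undesired keyword loses probability mass; the desired one gains it.
assert p_after[1] < p_before[1] and p_after[3] > p_before[3]
```

Here `lam` plays the role of the tuning hyper-parameter mentioned in the abstract: larger values enforce the attribute more strongly at the cost of fluency, since the reweighted distribution drifts further from the base LM's.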