Controlled Text Generation as Continuous Optimization with Multiple Constraints

Published: 09 Nov 2021, Last Modified: 08 Sept 2024
NeurIPS 2021 Poster
Readers: Everyone
Keywords: controllable text generation, constrained optimization, style transfer
TL;DR: We present a method of controllable inference from pretrained language models via continuous optimization through gradient descent
Abstract: As large-scale language model pretraining pushes the state of the art in text generation, recent work has turned to controlling attributes of the text such models generate. While modifying the pretrained models via fine-tuning remains the popular approach, it incurs a significant computational cost and can be infeasible due to a lack of appropriate data. As an alternative, we propose MuCoCO, a flexible and modular algorithm for controllable inference from pretrained models. We formulate the decoding process as an optimization problem in which the multiple attributes we aim to control are easily incorporated as differentiable constraints. By relaxing this discrete optimization to a continuous one, we can use Lagrangian multipliers and gradient-descent-based techniques to generate the desired text. We evaluate our approach on controllable machine translation and style transfer with multiple sentence-level attributes and observe significant improvements over baselines.
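To illustrate the formulation described in the abstract (decoding relaxed to a continuous optimization over soft token distributions, with differentiable attribute constraints handled via Lagrangian multipliers and gradient descent), here is a minimal sketch. It is not the paper's implementation (see the linked repository for that); `lm_loss_fn`, `constraint_fn`, and all hyperparameters are hypothetical stand-ins for a differentiable fluency objective and a differentiable attribute constraint.

```python
# Minimal sketch of constrained decoding as continuous optimization.
# All function names and hyperparameters below are illustrative assumptions,
# not the authors' API.
import torch

def constrained_decode(lm_loss_fn, constraint_fn, seq_len, vocab_size,
                       epsilon=0.1, steps=200, lr=0.1, lambda_lr=0.5):
    # Soft (continuous) relaxation of the discrete output sequence.
    logits = torch.zeros(seq_len, vocab_size, requires_grad=True)
    lam = torch.zeros(1)                     # Lagrange multiplier, kept >= 0
    opt = torch.optim.Adam([logits], lr=lr)

    for _ in range(steps):
        probs = torch.softmax(logits, dim=-1)       # relaxed "tokens"
        primal = lm_loss_fn(probs)                  # fluency: LM negative log-likelihood
        violation = constraint_fn(probs) - epsilon  # attribute constraint g(y) <= epsilon
        lagrangian = primal + lam.detach() * violation

        opt.zero_grad()
        lagrangian.backward()
        opt.step()                                  # gradient descent on the relaxed sequence

        with torch.no_grad():                       # gradient ascent on the multiplier
            lam = torch.clamp(lam + lambda_lr * violation.detach(), min=0.0)

    # Discretize the relaxed solution back into token ids.
    return torch.softmax(logits, dim=-1).argmax(dim=-1)
```

The alternating updates (descent on the relaxed sequence, ascent on the multiplier) are the standard primal-dual treatment of a constrained objective; additional attributes would each contribute their own constraint term and multiplier.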
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
Supplementary Material: pdf
Code: https://github.com/Sachin19/mucoco/
Community Implementations: 2 code implementations via CatalyzeX (https://www.catalyzex.com/paper/controlled-text-generation-as-continuous/code)
