CoCon: A Self-Supervised Approach for Controlled Text Generation

Published: 12 Jan 2021, Last Modified: 22 Oct 2023 · ICLR 2021 Poster
Keywords: Language modeling, text generation, controlled generation, self-supervised learning
Abstract: Pretrained Transformer-based language models (LMs) display remarkable natural language generation capabilities. Given their immense potential, controlling the text generated by such LMs is receiving growing attention. While prior studies seek to control high-level attributes of generated text (such as sentiment and topic), they still lack more precise control over its content at the word and phrase level. Here, we propose Content-Conditioner (CoCon) to control an LM's output text with a content input at a fine-grained level. In our self-supervised approach, the CoCon block learns to help the LM complete a partially-observed text sequence by conditioning on content inputs that are withheld from the LM. Through experiments, we show that CoCon can naturally incorporate target content into generated texts and control high-level text attributes in a zero-shot manner.
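The abstract describes CoCon as a block that conditions the LM's intermediate representations on a content input before they are passed to the LM's upper layers. The sketch below is a minimal, illustrative PyTorch module of such an interleave-style content-conditioning block; the class name `CoConBlock`, the use of `nn.MultiheadAttention`, and all dimensions are assumptions made for illustration, not the authors' released implementation (see the linked repository for that).

```python
# Illustrative sketch only: a content-conditioning block that cross-attends
# the LM's intermediate hidden states to a content input. Names, layer
# choices, and dimensions are assumptions, not the authors' implementation.
import torch
import torch.nn as nn


class CoConBlock(nn.Module):
    """Conditions prompt hidden states on content hidden states."""

    def __init__(self, d_model: int, n_heads: int = 8):
        super().__init__()
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, prompt_h: torch.Tensor, content_h: torch.Tensor) -> torch.Tensor:
        # prompt_h:  (batch, prompt_len, d_model)  hidden states from the LM's lower layers
        # content_h: (batch, content_len, d_model) hidden states of the content input
        attended, _ = self.cross_attn(query=prompt_h, key=content_h, value=content_h)
        h = self.norm1(prompt_h + attended)   # residual connection around cross-attention
        h = self.norm2(h + self.ff(h))        # position-wise feed-forward with residual
        return h                              # conditioned states, fed to the LM's upper layers
```

In the self-supervised setup described above, one would split a training sequence into an observed prefix and a withheld continuation, give only the prefix to the LM, give the withheld part to the block as `content_h`, and train with the usual next-token cross-entropy so the block learns to inject the withheld content into the completion.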
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
One-sentence Summary: We propose CoCon to control the content of text generation from LMs by conditioning on content inputs at an interleave layer.
Code: [alvinchangw/COCON_ICLR2021](https://github.com/alvinchangw/COCON_ICLR2021)
Data: [WebText](https://paperswithcode.com/dataset/webtext)
Community Implementations: [2 code implementations (CatalyzeX)](https://www.catalyzex.com/paper/arxiv:2006.03535/code)