Advancing Precise Outline-Conditioned Text Generation with Task Duality and Explicit Outline Control

Anonymous

16 Oct 2023 · ACL ARR 2023 October Blind Submission · Readers: Everyone
Abstract: Existing works on outline-conditioned text generation typically aim to generate text using provided outlines as rough sketches, such as keywords and phrases. However, these approaches make it difficult to control the quality of the generated text and to assess the consistency between outlines and generated texts, owing to the lack of clarity and rationality of the rough outlines. In this paper, we introduce a novel text generation task called \textbf{\textit{Precise Outline-conditioned Generation}}, which requires generating stories based on \textit{specific}, \textit{sentence-level} outlines. To facilitate research on this task, we construct two new datasets, \textbf{WPOG} and \textbf{CDM}. We provide strong baselines by fine-tuning models such as BART and GPT-2 and by evaluating the zero-shot performance of models such as ChatGPT and Vicuna. Furthermore, we identify an issue of \textbf{imbalanced utilization of the outline information} in precise outline-conditioned generation, which is observed ubiquitously across both fine-tuned models and zero-shot inference models. To address this issue, we propose an \textbf{explicit outline utilization control approach} and a novel framework that \textbf{leverages the task duality between summarization and generation}. Experimental results show that the proposed approaches effectively alleviate the imbalanced outline utilization and improve the quality of precise outline-conditioned text generation in both fine-tuning and zero-shot settings.
Paper Type: long
Research Area: Generation
Contribution Types: NLP engineering experiment, Data resources
Languages Studied: English
Consent To Share Submission Details: On behalf of all authors, we agree to the terms above to share our submission details.