Record-to-Text Generation with Style Imitation

Anonymous

17 Jun 2020 (modified: 17 Jun 2020) · OpenReview Anonymous Preprint Blind Submission · Readers: Everyone
Keywords: Natural Language Generation, Record-to-Text Generation, Style Imitation
TL;DR: A new approach to text generation that learns to imitate the writing style of any given exemplar sentence, with automatic adaptations to describe the record faithfully.
Abstract: Recent neural approaches to record-to-text generation have mostly focused on improving content fidelity, while offering little explicit control over writing style (e.g., sentence structure, word choice). More traditional systems use templates to determine the realization of text. Yet manual or automatic construction of high-quality templates is difficult, and a template acting as a hard constraint can harm content fidelity when it does not match the record perfectly. We study a new form of stylistic control that uses existing sentences as “soft” templates. That is, a model learns to imitate the writing style of any given exemplar sentence, with automatic adaptations to describe the record faithfully. The problem is challenging due to the lack of parallel data. We develop a neural approach that includes a hybrid attention-copy mechanism, learns with weak supervision, and is enhanced with a new content coverage constraint. We conduct experiments in the restaurant and sports domains. Results show that our approach outperforms a range of comparison methods and balances content fidelity and style control well given exemplars that match the records to varying degrees.
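The abstract only names the hybrid attention-copy mechanism without detailing it. As a rough illustration, the sketch below shows one decoder step of a pointer-generator style attention-copy layer, which mixes generating from a fixed vocabulary with copying tokens from the encoded record/exemplar. The PyTorch framing, module names, and tensor shapes are assumptions made for illustration only, not the authors' actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class HybridAttentionCopy(nn.Module):
    """Illustrative pointer-generator style decoder step (hypothetical):
    interpolates a vocabulary generation distribution with a copy
    distribution over source (record/exemplar) tokens via attention."""

    def __init__(self, hidden_size: int, vocab_size: int):
        super().__init__()
        self.vocab_size = vocab_size
        self.attn = nn.Linear(hidden_size * 2, 1)         # attention scorer
        self.gen_proj = nn.Linear(hidden_size * 2, vocab_size)
        self.copy_gate = nn.Linear(hidden_size * 2, 1)    # gen-vs-copy gate

    def forward(self, dec_state, src_states, src_token_ids):
        # dec_state:     (batch, hidden)           current decoder state
        # src_states:    (batch, src_len, hidden)  encoded source tokens
        # src_token_ids: (batch, src_len)          source token ids for copying
        batch, src_len, _ = src_states.size()

        # Attention scores over source positions
        expanded = dec_state.unsqueeze(1).expand(-1, src_len, -1)
        scores = self.attn(torch.cat([expanded, src_states], dim=-1)).squeeze(-1)
        attn_weights = F.softmax(scores, dim=-1)           # (batch, src_len)

        # Context vector and generation distribution over the vocabulary
        context = torch.bmm(attn_weights.unsqueeze(1), src_states).squeeze(1)
        mixed = torch.cat([dec_state, context], dim=-1)
        p_gen = F.softmax(self.gen_proj(mixed), dim=-1)    # (batch, vocab)

        # Copy distribution: scatter attention mass onto source token ids
        p_copy = torch.zeros(batch, self.vocab_size, device=dec_state.device)
        p_copy.scatter_add_(1, src_token_ids, attn_weights)

        # Soft gate interpolates between generating and copying
        gate = torch.sigmoid(self.copy_gate(mixed))        # (batch, 1)
        return gate * p_gen + (1 - gate) * p_copy
```

In this kind of setup, the content coverage constraint mentioned in the abstract would typically be an auxiliary loss encouraging the accumulated attention/copy mass over record fields to cover each field roughly once; the exact formulation used in the paper is not specified here.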