ABEX: Generative Data Augmentation for Low-Resource NLP via Expanding Abstract Descriptions

Anonymous

16 Dec 2023 · ACL ARR 2023 December Blind Submission · Readers: Everyone
Abstract: We present \textbf{ABEX}, a novel and effective generative data augmentation methodology for low-resource NLP. ABEX is based on \textbf{AB}stract-and-\textbf{EX}pand, a novel paradigm for generating diverse forms of an input document -- we first convert a document into its concise, abstract description and then generate new documents by expanding the resultant abstraction. To learn the task of expanding abstract descriptions, we first train BART on a large-scale synthetic dataset of abstract-document pairs. Next, to generate abstract descriptions for a document, we propose a simple, controllable, and training-free method based on editing AMR graphs. ABEX brings the best of both worlds: by expanding from abstract representations, it preserves the original semantic properties of the documents, like style and meaning, thereby maintaining alignment with the original label and data distribution. At the same time, the fundamental process of elaborating on abstract descriptions facilitates diverse generations. We demonstrate the effectiveness of ABEX on 12 datasets spanning 4 NLU tasks under 4 low-resource (data-scarce) settings. Quantitatively, ABEX outperforms all our baselines, with improvements of 0.04\% - 38.8\%. Qualitatively, ABEX outperforms all prior methods from the literature in terms of context and length diversity.
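The abstract-and-expand paradigm can be sketched as a two-stage augmentation loop. The sketch below is not the authors' implementation: the real system abstracts via AMR graph editing and expands with a fine-tuned BART model, whereas here both stages are stand-in callables (`abstract_fn`, `expand_fn`, and the toy lambdas are hypothetical names introduced only for illustration), so only the control flow is shown.

```python
# Minimal sketch of the ABstract-and-EXpand augmentation loop.
# In ABEX itself, abstract_fn would be the AMR-graph-editing
# abstraction step and expand_fn the BART-based expansion step;
# here they are injected as plain callables so the loop runs alone.

from typing import Callable, List


def abex_augment(
    documents: List[str],
    abstract_fn: Callable[[str], str],        # stand-in: AMR-edit abstraction
    expand_fn: Callable[[str], List[str]],    # stand-in: BART expansion
    n_per_doc: int = 2,
) -> List[str]:
    """Return up to n_per_doc generated documents per input document."""
    augmented: List[str] = []
    for doc in documents:
        abstraction = abstract_fn(doc)                 # 1) abstract
        for new_doc in expand_fn(abstraction)[:n_per_doc]:  # 2) expand
            augmented.append(new_doc)
    return augmented


# Toy stand-ins: truncate to a three-word "gist", then emit two variants.
toy_abstract = lambda d: " ".join(d.split()[:3])
toy_expand = lambda a: [a + " (variant 1)", a + " (variant 2)"]

print(abex_augment(["a movie review about pacing"], toy_abstract, toy_expand))
# → ['a movie review (variant 1)', 'a movie review (variant 2)']
```

Because expansion starts from the abstraction rather than the surface text, label-relevant semantics survive while surface form is free to vary.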
Paper Type: long
Research Area: Efficient/Low-Resource Methods for NLP
Contribution Types: Approaches to low-resource settings
Languages Studied: English