A Zero-Resource Approach to Cross-Lingual Query-Focused Abstractive Summarization

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission
Abstract: We present a novel approach to cross-lingual query-focused abstractive summarization (QFAS) that builds on the translate-then-summarize paradigm. We treat cross-lingual QFAS as a zero-resource problem and introduce a framework that creates a synthetic QFAS corpus from a standard summarization corpus using a novel query-generation strategy. Our model summarizes documents in foreign languages for which translation quality is poor. It learns not only to identify and condense salient information relevant to a query, but also to smooth over grammatical errors and disfluencies introduced by noisy translations. Our technique enhances a pre-trained encoder-decoder transformer by introducing query focus to the encoder. We show that our method for creating synthetic QFAS data leads to more robust models that not only achieve state-of-the-art performance on our corpus, but also perform better on out-of-distribution data than prior work.
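
For illustration, below is a minimal sketch of the translate-then-summarize pipeline described in the abstract. It assumes query focus is introduced by simply prepending the query to the translated document before encoding; the model checkpoints, the concatenation format, and the helper function name are hypothetical choices for this example, not the paper's actual configuration or query-focus mechanism.

# Illustrative translate-then-summarize sketch: the foreign-language document is
# first machine-translated into English, then summarized by a pretrained
# encoder-decoder transformer conditioned on the query.
from transformers import pipeline

# Hypothetical off-the-shelf checkpoints; the paper does not name specific models.
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-de-en")
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

def query_focused_summary(query: str, foreign_document: str) -> str:
    # Step 1: translate the foreign-language document (output may be noisy).
    translation = translator(foreign_document, max_length=512)[0]["translation_text"]

    # Step 2: inject query focus by prepending the query to the encoder input.
    # This is one simple way to condition the encoder on the query; the paper's
    # actual mechanism may differ.
    encoder_input = f"{query} </s> {translation}"

    # Step 3: generate an abstractive, query-focused summary.
    summary = summarizer(encoder_input, max_length=128, min_length=30, do_sample=False)
    return summary[0]["summary_text"]

if __name__ == "__main__":
    print(query_focused_summary(
        "What caused the power outage?",
        "Ein Sturm beschaedigte mehrere Stromleitungen in der Region.",
    ))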