VieSum: How Robust Are Transformer-based Models on Vietnamese Summarization?

Hieu Nguyen, Long Phan, James T. Anibal, Alec Peltekian, Hieu Tran

Published: 2021 (CoRR 2021). License: CC BY-SA 4.0.
Abstract: Text summarization is a challenging task within natural language processing that involves text generation from lengthy input sequences. While this task has been widely studied in English, there is very limited research on summarization for Vietnamese text. In this paper, we investigate the robustness of transformer-based encoder-decoder architectures for Vietnamese abstractive summarization. Leveraging transfer learning and self-supervised learning, we validate the performance of the methods on two Vietnamese datasets.