Sequence-to-Sequence RNNs for Text Summarization

Published: 18 Feb 2016 (last modified: 28 Mar 2024) · ICLR 2016 workshop submission
Abstract: In this work, we cast text summarization as a sequence-to-sequence problem and apply the attentional encoder-decoder RNN that has been shown to be successful for machine translation. Our experiments show that the proposed architecture significantly outperforms the state-of-the-art model of Rush et al. (2015) on the Gigaword dataset without any additional tuning. We also propose several extensions to the standard architecture, which we show contribute to further improvements in performance.
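For concreteness, below is a minimal sketch of how such an attentional encoder-decoder RNN could be assembled in PyTorch. The class name AttentionalSeq2Seq, the choice of GRU cells, the dot-product attention variant, and all dimensions are illustrative assumptions made here for brevity; they are not the authors' exact configuration (the paper builds on the attention mechanism shown successful for machine translation, e.g. Bahdanau et al., 2014).

```python
# Illustrative sketch only: hyperparameters, cell types, and the
# dot-product attention variant are assumptions, not the paper's model.
import torch
import torch.nn as nn


class AttentionalSeq2Seq(nn.Module):
    def __init__(self, vocab_size: int, emb_dim: int = 128, hid_dim: int = 256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Bidirectional GRU encoder reads the source document.
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True,
                              bidirectional=True)
        # Unidirectional GRU decoder generates the summary, conditioned on
        # the previous token embedding concatenated with the attention context.
        self.decoder = nn.GRU(emb_dim + 2 * hid_dim, hid_dim, batch_first=True)
        self.attn_proj = nn.Linear(hid_dim, 2 * hid_dim)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src: torch.Tensor, tgt: torch.Tensor) -> torch.Tensor:
        # src: (batch, src_len) token ids; tgt: (batch, tgt_len) token ids
        # (teacher forcing: the gold summary is fed in at training time).
        enc_out, _ = self.encoder(self.embed(src))       # (B, S, 2H)
        state, logits = None, []
        for t in range(tgt.size(1)):
            emb = self.embed(tgt[:, t:t + 1])            # (B, 1, E)
            # Attention query is the previous decoder state (zeros at t=0).
            query = state[-1] if state is not None else \
                enc_out.new_zeros(src.size(0), self.decoder.hidden_size)
            # Dot-product attention over all encoder states.
            scores = torch.bmm(enc_out, self.attn_proj(query).unsqueeze(2))
            weights = torch.softmax(scores, dim=1)       # (B, S, 1)
            context = (weights * enc_out).sum(1, keepdim=True)  # (B, 1, 2H)
            step_out, state = self.decoder(
                torch.cat([emb, context], dim=2), state)
            logits.append(self.out(step_out))            # (B, 1, V)
        return torch.cat(logits, dim=1)                  # (B, T, V)


if __name__ == "__main__":
    model = AttentionalSeq2Seq(vocab_size=10000)
    src = torch.randint(0, 10000, (4, 50))   # 4 source documents, 50 tokens each
    tgt = torch.randint(0, 10000, (4, 12))   # teacher-forced summary prefixes
    print(model(src, tgt).shape)             # torch.Size([4, 12, 10000])
```

The per-step logits would typically be trained with cross-entropy against the reference summary tokens; at inference time, greedy or beam-search decoding replaces the teacher-forced target input.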
Conflicts: us.ibm.com