Sequence-to-Sequence RNNs for Text Summarization
Ramesh Nallapati, Bing Xiang, Bowen Zhou
Feb 18, 2016 (modified: Feb 18, 2016) · ICLR 2016 workshop submission · readers: everyone
Abstract: In this work, we cast text summarization as a sequence-to-sequence problem and apply the attentional encoder-decoder RNN that has been shown to be successful for Machine Translation.
Our experiments show that the proposed architecture significantly outperforms the state-of-the-art model of Rush et al. (2015) on the Gigaword dataset, without any additional tuning. We also propose extensions to the standard architecture, which we show contribute to further improvements in performance.
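The core of the attentional encoder-decoder is that, at each decoding step, the decoder forms a context vector as a weighted average of the encoder's hidden states, with the weights computed by a softmax over alignment scores. The sketch below is illustrative only, not the authors' implementation: it uses simple dot-product scores with NumPy placeholders, whereas attentional models of this family (e.g. Bahdanau et al., 2015) typically use a learned additive scoring function.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention(encoder_states, decoder_state):
    """One attention step (illustrative sketch, not the paper's exact model).

    encoder_states: (T, H) array of encoder hidden states
    decoder_state:  (H,)  current decoder hidden state
    Returns (weights over the T source positions, context vector of size H).
    """
    # Dot-product alignment scores (assumption: the paper's model
    # would use a learned scoring function instead).
    scores = encoder_states @ decoder_state          # (T,)
    weights = softmax(scores)                        # (T,), sums to 1
    context = weights @ encoder_states               # (H,)
    return weights, context

# Toy usage with random states standing in for learned representations.
enc = np.random.randn(5, 4)   # 5 source positions, hidden size 4
dec = np.random.randn(4)
w, c = attention(enc, dec)
```

The context vector `c` is what the decoder would combine with its own state to predict the next summary word.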