A Token-wise CNN-based Method for Sentence Compression

23 Sept 2020 · OpenReview Archive Direct Upload
Abstract: Sentence compression is a Natural Language Processing (NLP) task aimed at shortening original sentences while preserving their key information. Its applications can benefit many fields; e.g., one can build tools for language education. However, current methods are largely based on Recurrent Neural Network (RNN) models, which suffer from poor processing speed. To address this issue, in this paper we propose a token-wise Convolutional Neural Network, a CNN-based model combined with pre-trained Bidirectional Encoder Representations from Transformers (BERT) features, for deletion-based sentence compression. We also compare our model with RNN-based models and fine-tuned BERT. Although one of the RNN-based models marginally outperforms the other models given the same input, our CNN-based model is ten times faster than the RNN-based approach.
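
As a rough illustration of the general approach described in the abstract (per-token keep/delete classification with a CNN over pre-trained BERT features), the following PyTorch sketch is an assumption-based reconstruction, not the paper's implementation; the class name, filter count, kernel size, and label convention are hypothetical.

```python
# Minimal sketch (assumptions, not the authors' model): a token-wise CNN
# classifier over frozen pre-trained BERT features that predicts, for each
# token, whether to keep or delete it (deletion-based sentence compression).
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast


class TokenwiseCNNCompressor(nn.Module):  # hypothetical name
    def __init__(self, bert_name="bert-base-uncased", num_filters=256, kernel_size=3):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        for p in self.bert.parameters():       # use BERT purely as a feature extractor
            p.requires_grad = False
        hidden = self.bert.config.hidden_size
        self.conv = nn.Conv1d(hidden, num_filters, kernel_size, padding=kernel_size // 2)
        self.classifier = nn.Linear(num_filters, 2)  # per-token logits: keep vs. delete

    def forward(self, input_ids, attention_mask):
        with torch.no_grad():
            feats = self.bert(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        # Conv1d expects (batch, channels, seq_len), so transpose around the convolution
        x = torch.relu(self.conv(feats.transpose(1, 2))).transpose(1, 2)
        return self.classifier(x)               # (batch, seq_len, 2)


if __name__ == "__main__":
    tok = BertTokenizerFast.from_pretrained("bert-base-uncased")
    model = TokenwiseCNNCompressor()
    enc = tok("The quick brown fox jumps over the lazy dog .", return_tensors="pt")
    logits = model(enc["input_ids"], enc["attention_mask"])
    keep_mask = logits.argmax(dim=-1)           # assumed convention: 1 = keep, 0 = delete
    print(keep_mask)
```

Because the convolution and per-token classifier run over the whole sequence in parallel, a setup like this avoids the sequential dependency of RNN decoding, which is the likely source of the speed advantage the abstract reports.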