Sentence Ordering using Recurrent Neural Networks
Lajanugen Logeswaran, Honglak Lee, Dragomir Radev
Nov 03, 2016 (modified: Jan 15, 2017) · ICLR 2017 conference submission · Readers: everyone
Abstract: Modeling the structure of coherent texts is a task of great importance in NLP. The task of organizing a given set of sentences into a coherent order has been commonly used to build and evaluate models that understand such structure. In this work we propose an end-to-end neural approach based on the recently proposed set-to-sequence mapping framework to address the sentence ordering problem. Our model achieves state-of-the-art performance on the order discrimination task on two datasets widely used in the literature. We also consider an interesting new task of ordering abstracts from conference papers and research proposals, and demonstrate strong performance against recent methods. Visualizing the sentence representations learned by the model shows that it has captured high-level logical structure in these paragraphs. The model also learns rich semantic sentence representations by learning to order texts, performing comparably to recent unsupervised representation-learning methods on sentence similarity and paraphrase detection tasks.
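The sentence ordering task described above takes a shuffled set of sentences and asks a model to predict their original order; a predicted ordering is commonly scored against the gold ordering with Kendall's tau, which rewards getting pairwise precedence relations right. A minimal sketch of that evaluation (the sentence labels and orderings here are illustrative, not from the paper's data):

```python
from itertools import combinations

def kendall_tau(predicted, gold):
    """Kendall's tau between a predicted and a gold ordering.

    Counts pairwise inversions: pairs of sentences whose relative
    order in `predicted` disagrees with `gold`. Ranges from 1.0
    (identical order) to -1.0 (fully reversed).
    """
    n = len(gold)
    pos = {s: i for i, s in enumerate(predicted)}  # sentence -> predicted index
    inversions = sum(1 for a, b in combinations(gold, 2) if pos[a] > pos[b])
    return 1.0 - 2.0 * inversions / (n * (n - 1) / 2)

# Toy example: four sentences s1..s4 in their gold order.
gold = ["s1", "s2", "s3", "s4"]
print(kendall_tau(["s1", "s2", "s3", "s4"], gold))  # 1.0  (perfect order)
print(kendall_tau(["s4", "s3", "s2", "s1"], gold))  # -1.0 (fully reversed)
```

A random permutation scores near 0 in expectation, so tau gives an intuitive scale for how much of the paragraph's structure a model has recovered.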
TL;DR: We consider the problem of organizing a given collection of sentences into a coherent order.
Keywords: Natural language processing, Deep learning, Applications