Neural Text Understanding with Attention Sum Reader
Rudolf Kadlec, Martin Schmid, Ondřej Bajgar, Jan Kleindienst
Feb 18, 2016 (modified: Feb 18, 2016) · ICLR 2016 workshop submission · readers: everyone
Abstract: Two large-scale cloze-style context-question-answer datasets have been introduced recently: i) the CNN and Daily Mail news data and ii) the Children's Book Test.
Thanks to the size of these datasets, the associated task is well suited for deep-learning techniques that seem to outperform all alternative approaches.
We present a new, simple model that is tailor-made for such question-answering problems.
Our model directly sums attention over candidate answer words in the document instead of using attention to compute a weighted sum of word embeddings.
Our model outperforms models previously proposed for these tasks by a large margin.