Neural Text Understanding with Attention Sum Reader

23 Apr 2024 (modified: 18 Feb 2016) · ICLR 2016 workshop submission
Abstract: Two large-scale cloze-style context-question-answer datasets have been introduced recently: i) the CNN and Daily Mail news data and ii) the Children's Book Test. Thanks to the size of these datasets, the associated task is well suited to deep-learning techniques, which seem to outperform all alternative approaches. We present a new, simple model that is tailor-made for such question-answering problems. Our model directly sums attention over candidate answer words in the document instead of using it to compute a weighted sum of word embeddings. Our model outperforms models previously proposed for these tasks by a large margin.
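The core idea in the abstract can be illustrated with a minimal sketch: instead of blending word embeddings with attention weights, each candidate answer is scored by summing the attention mass over all of its occurrences in the document. The function name, shapes, and token ids below are illustrative assumptions, not the paper's actual implementation.

```python
# Hedged sketch of the attention-sum ("pointer sum") scoring step.
# Assumes attention weights over document positions have already been
# computed by some reader model; only the final aggregation is shown.
import numpy as np

def attention_sum(attention, doc_tokens, candidates):
    """Score each candidate answer by summing the attention weights of
    every document position where that candidate token occurs.

    attention  -- 1-D array of attention weights, one per document position
    doc_tokens -- list of token ids, same length as `attention`
    candidates -- list of candidate answer token ids
    """
    scores = {}
    for cand in candidates:
        # Aggregate attention over all occurrences of the candidate.
        scores[cand] = float(sum(w for w, t in zip(attention, doc_tokens)
                                 if t == cand))
    return scores

# Toy example: token 7 appears twice, so its score aggregates the
# attention from both occurrences (0.4 + 0.3).
attn = np.array([0.1, 0.4, 0.2, 0.3])
doc = [5, 7, 9, 7]
scores = attention_sum(attn, doc, candidates=[5, 7, 9])
best = max(scores, key=scores.get)
```

The sum over repeated occurrences is what distinguishes this from picking the single most-attended word: a candidate mentioned several times can accumulate more total attention than one with a single strong peak.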
Conflicts: ibm.com