A Context-aware Attention Network for Interactive Question Answering
Huayu Li, Martin Renqiang Min, Yong Ge, Asim Kadav
Nov 04, 2016 (modified: Dec 05, 2016) · ICLR 2017 conference submission · Readers: everyone
Abstract: We develop a new model for Interactive Question Answering (IQA), using Gated Recurrent Unit networks (GRUs) as encoders for statements and questions, and another GRU as a decoder for outputs. Distinct from previous work, our approach employs context-dependent word-level attention for more accurate statement representations and question-guided sentence-level attention for better context modeling. With these mechanisms, our model learns to recognize when it can output an answer and when it instead needs to generate a supplementary question to elicit additional input. When available, the user's feedback is encoded and directly applied to update the sentence-level attention and infer the answer. Extensive experiments on QA and IQA datasets quantitatively demonstrate the effectiveness of our model, with significant improvements over conventional QA models.
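The question-guided sentence-level attention described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: dot-product scoring, the dimensions, and the random placeholder encodings are all assumptions; in the actual model the sentence and question vectors would come from the GRU encoders.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def question_guided_attention(sentence_encodings, question_encoding):
    """Weight each context sentence by its relevance to the question.

    sentence_encodings: (num_sentences, d) array, standing in for GRU
        sentence vectors. question_encoding: (d,) array, standing in for
    the GRU question vector. Dot-product scoring is an assumption here.
    """
    scores = sentence_encodings @ question_encoding   # relevance score per sentence
    weights = softmax(scores)                         # attention distribution
    context = weights @ sentence_encodings            # attention-weighted context vector
    return weights, context

# Toy usage with random placeholder encodings (4 sentences, 8-dim vectors).
rng = np.random.default_rng(0)
sents = rng.normal(size=(4, 8))
q = rng.normal(size=8)
weights, context = question_guided_attention(sents, q)
```

The attention weights form a probability distribution over the context sentences, so the decoder (or a supplementary-question trigger) can read off which sentences the question found relevant; a flat distribution would suggest the model lacks the information needed to answer.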
TL;DR: A self-adaptive QA model, aware of what it knows and what it does not know, for interactive question answering.
Keywords: Deep learning, Natural language processing, Applications