FlowQA: Grasping Flow in History for Conversational Machine Comprehension

Published: 21 Dec 2018, Last Modified: 05 May 2023 · ICLR 2019 Conference Blind Submission
Abstract: Conversational machine comprehension requires a deep understanding of the conversation history. To enable traditional, single-turn models to encode the history comprehensively, we introduce Flow, a mechanism that can incorporate intermediate representations generated during the process of answering previous questions, through an alternating parallel processing structure. Compared to shallow approaches that concatenate previous questions/answers as input, Flow integrates the latent semantics of the conversation history more deeply. Our model, FlowQA, shows superior performance on two recently proposed conversational challenges (+7.2% F1 on CoQA and +4.0% on QuAC). The effectiveness of Flow also extends to other tasks. By reducing sequential instruction understanding to conversational machine comprehension, FlowQA outperforms the best models on all three domains in SCONE, with +1.8% to +4.4% improvement in accuracy.
Keywords: Machine Comprehension, Conversational Agent, Natural Language Processing, Deep Learning
TL;DR: We propose the Flow mechanism and an end-to-end architecture, FlowQA, that achieves SotA on two conversational QA datasets and a sequential instruction understanding task.
Code: [momohuang/FlowQA](https://github.com/momohuang/FlowQA)
Data: [CoQA](https://paperswithcode.com/dataset/coqa), [QuAC](https://paperswithcode.com/dataset/quac)