Question Decomposition with Dependency Graphs

Published: 31 Aug 2021, Last Modified: 22 Oct 2023
Venue: AKBC 2021
Keywords: NLP, QA, dependency parsing, multitask, transformers, QDMR
TL;DR: Combining seq2seq-based and graph-based approaches to improve sample complexity and domain generalization in question decomposition.
Abstract: QDMR is a meaning representation for complex questions that decomposes a question into a sequence of atomic steps, and it has recently been shown to be useful for question answering. While state-of-the-art QDMR parsers use the common sequence-to-sequence (seq2seq) approach, a QDMR structure fundamentally describes labeled relations between spans in the input question, so dependency-based approaches seem well suited to this task. In this work, we present a QDMR parser based on dependency graphs (DGs), where nodes in the graph are words and edges describe logical relations that correspond to the different computation steps. We propose (a) a non-autoregressive graph parser, where all graph edges are computed simultaneously, and (b) a seq2seq parser that uses the gold graph as auxiliary supervision. First, we find that the graph parser leads to a moderate reduction in performance (0.47 to 0.44), but to a 16x speed-up in inference time due to its non-autoregressive nature, and to improved sample complexity compared to a seq2seq model. Second, training a seq2seq model with auxiliary DG supervision leads to better generalization on out-of-domain data and on QDMR structures with long sequences of computation steps.
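
The core architectural idea in the abstract, computing all graph edges simultaneously rather than decoding them step by step, can be illustrated with a standard biaffine edge scorer from the dependency-parsing literature. The sketch below is illustrative only: the module name, dimensions, label inventory, and the einsum formulation are assumptions, not the authors' released code. Token encodings from a pretrained transformer are projected into head and dependent spaces, and one bilinear map per logical-relation label scores every (head, dependent) pair in a single forward pass.

```python
# Minimal sketch of a non-autoregressive graph parser for QDMR.
# All names, dimensions, and the "no edge" label convention are
# illustrative assumptions, not the authors' implementation.
import torch
import torch.nn as nn

class BiaffineEdgeScorer(nn.Module):
    """Scores every (head, dependent, label) triple in one forward pass."""

    def __init__(self, hidden_dim: int, arc_dim: int, num_labels: int):
        super().__init__()
        # Separate projections for a token acting as head vs. dependent.
        self.head_mlp = nn.Sequential(nn.Linear(hidden_dim, arc_dim), nn.ReLU())
        self.dep_mlp = nn.Sequential(nn.Linear(hidden_dim, arc_dim), nn.ReLU())
        # One bilinear map per logical-relation label (label 0 = "no edge").
        self.bilinear = nn.Parameter(
            torch.randn(num_labels, arc_dim, arc_dim) * 0.02
        )

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, hidden_dim) contextual encodings,
        # e.g. from a pretrained transformer over the question tokens.
        heads = self.head_mlp(h)  # (batch, seq, arc_dim)
        deps = self.dep_mlp(h)    # (batch, seq, arc_dim)
        # Contract over the arc dimensions to score all label/head/dependent
        # triples simultaneously: (batch, num_labels, seq, seq).
        return torch.einsum("bia,lac,bjc->blij", heads, self.bilinear, deps)

# Usage: decode the full graph with a per-pair argmax, no autoregression.
scorer = BiaffineEdgeScorer(hidden_dim=768, arc_dim=256, num_labels=12)
h = torch.randn(2, 20, 768)      # stand-in for encoder output
edges = scorer(h).argmax(dim=1)  # (2, 20, 20) predicted edge labels
```

Because every pair is scored at once, inference costs one encoder pass plus a tensor contraction, which is consistent with the 16x speed-up the abstract reports over step-by-step seq2seq decoding. Under the same assumptions, the auxiliary-supervision variant (b) would add a cross-entropy loss over these edge scores to the standard seq2seq objective during training.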
Subject Areas: Question Answering and Reasoning, Machine Learning
Archival Status: Non-Archival
Community Implementations: 1 code implementation (https://www.catalyzex.com/paper/arxiv:2104.08647/code)