Retrieval-Augmented Parsing for Complex Graphs by Exploiting Structure and Uncertainty

Published: 07 Oct 2023, Last Modified: 01 Dec 2023
Venue: EMNLP 2023 Findings
Submission Type: Regular Long Paper
Submission Track: Semantics: Lexical, Sentence level, Document Level, Textual Inference, etc.
Submission Track 2: Syntax, Parsing and their Applications
Keywords: Uncertainty Quantification, Retrieval, Semantic Parsing
TL;DR: We present an effective retrieval augmentation method for parsing complex graph outputs that exploits (1) structural similarity and (2) model uncertainty.
Abstract: Retrieval augmentation enhances generative language models by retrieving informative exemplars relevant for output prediction. However, in realistic graph parsing problems where the output space is large and complex, classic retrieval methods based on input-sentence similarity can fail to identify the most informative exemplars, i.e., those targeting the graph elements the model is most struggling with, leading to suboptimal retrieval and compromised prediction under a limited retrieval budget. In this work, we improve retrieval-augmented parsing for complex graph problems by exploiting two unique sources of information: (1) structural similarity and (2) model uncertainty. We propose $\textit{\textbf{S}tructure-aware and \textbf{U}ncertainty-\textbf{G}uided \textbf{A}daptive \textbf{R}etrieval}$ $\textbf{(SUGAR)}$, which first quantifies the model's uncertainty in graph prediction and identifies its most uncertain subgraphs, and then retrieves exemplars based on their structural similarity to the identified uncertain subgraphs. On a suite of real-world parsing benchmarks with non-trivial graph structure (SMCalflow and E-commerce), SUGAR exhibits a strong advantage over its classic counterparts that do not leverage structure or model uncertainty.
Submission Number: 609
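The abstract outlines a two-step procedure: score the model's uncertainty over the predicted graph, isolate the most uncertain subgraphs, and retrieve exemplars that structurally match them. Below is a minimal Python sketch of that idea under stated assumptions; the graph interface (`.nodes`, `.edges`, `neighbourhood()`), the entropy-based uncertainty score, and the Jaccard-style structural similarity are illustrative stand-ins, not the paper's actual implementation.

    import math
    from typing import List

    def token_entropy(probs: List[float]) -> float:
        # Shannon entropy of the predicted-token distribution; higher = more uncertain.
        return -sum(p * math.log(p) for p in probs if p > 0)

    def most_uncertain_subgraphs(pred_graph, token_probs, k=3):
        # Rank predicted nodes by the entropy of the model's distribution at that node
        # (token_probs maps node -> probability vector; an assumed interface),
        # then return the local neighbourhoods of the k most uncertain nodes.
        ranked = sorted(pred_graph.nodes,
                        key=lambda n: token_entropy(token_probs[n]),
                        reverse=True)
        return [pred_graph.neighbourhood(n) for n in ranked[:k]]

    def structural_similarity(subgraph, exemplar_graph) -> float:
        # Jaccard overlap between edge sets, used here as a simple
        # stand-in for whatever graph similarity the paper employs.
        a, b = set(subgraph.edges), set(exemplar_graph.edges)
        return len(a & b) / len(a | b) if (a | b) else 0.0

    def sugar_retrieve(pred_graph, token_probs, exemplar_pool, budget=5):
        # Uncertainty-guided retrieval: score each exemplar by its best structural
        # match to any uncertain subgraph, then keep the top `budget` exemplars.
        uncertain = most_uncertain_subgraphs(pred_graph, token_probs)
        scored = [(max(structural_similarity(sg, ex.graph) for sg in uncertain), ex)
                  for ex in exemplar_pool]
        scored.sort(key=lambda t: t[0], reverse=True)
        return [ex for _, ex in scored[:budget]]

The key design point, as described in the abstract, is that retrieval is conditioned on where the model is uncertain in the output graph rather than on whole-input similarity, so a fixed retrieval budget is spent on exemplars covering the hardest subgraphs.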