Weakly Supervised Text-to-SQL Parsing through Question Decomposition

Anonymous

16 Jan 2022 (modified: 05 May 2023) · ACL ARR 2022 January Blind Submission
Abstract: Text-to-SQL parsers are crucial in enabling non-experts to effortlessly query relational data. However, training such parsers generally requires expert annotation of natural language (NL) utterances paired with corresponding SQL queries. In this work, we propose a weak supervision approach for training text-to-SQL parsers. We take advantage of the recently proposed question meaning representation called QDMR, an intermediate between NL and formal query languages. We show that given questions, their QDMR structures (annotated by non-experts or automatically predicted), and the answers, we can automatically synthesize SQL queries that are then used to train text-to-SQL models. Extensive experiments test our approach on five benchmark datasets. The results show that our models perform competitively with those trained on annotated NL-SQL data. Overall, we effectively train text-to-SQL parsers using zero SQL annotations.
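The abstract describes a pipeline in which SQL queries synthesized from QDMR decompositions are validated against the annotated answers before being used as training data. The sketch below illustrates that answer-based filtering step only; it is not the authors' implementation, and the `qdmr_to_sql` converter is a hypothetical placeholder for the paper's actual QDMR-to-SQL synthesis procedure.

```python
# Minimal sketch (assumed, not the authors' code) of answer-based filtering:
# keep a synthesized SQL query as a training example only if executing it
# against the database yields the annotated answer.
import sqlite3
from typing import Callable, Iterable, List, Optional


def execute_query(db_path: str, sql: str) -> Optional[set]:
    """Execute a candidate query; return its result set, or None if it fails."""
    try:
        with sqlite3.connect(db_path) as conn:
            return set(conn.execute(sql).fetchall())
    except sqlite3.Error:
        return None


def synthesize_training_pairs(
    examples: Iterable[dict],  # each: {"question", "qdmr_steps", "answer", "db_path"}
    qdmr_to_sql: Callable[[List[str]], Optional[str]],  # hypothetical converter
) -> List[dict]:
    """Collect (question, SQL) pairs whose execution result matches the gold answer."""
    train = []
    for ex in examples:
        sql = qdmr_to_sql(ex["qdmr_steps"])
        if sql is None:
            continue  # synthesis failed for this decomposition
        result = execute_query(ex["db_path"], sql)
        # Compare the denotation of the synthesized query with the gold answer.
        if result is not None and result == set(map(tuple, ex["answer"])):
            train.append({"question": ex["question"], "sql": sql})
    return train
```

The resulting (question, SQL) pairs could then be fed to any standard supervised text-to-SQL model, which is how the abstract's claim of training with zero SQL annotations would be realized under these assumptions.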
Paper Type: long