Paper Link: https://openreview.net/forum?id=T4mIFZTlEF
Paper Type: Long paper (up to eight pages of content + unlimited references and appendices)
Abstract: Text-to-SQL parsers are crucial in enabling non-experts to effortlessly query relational data. In contrast, training such parsers generally requires expertise in annotating natural language (NL) utterances with their corresponding SQL queries.
In this work, we propose a weak supervision approach for training text-to-SQL parsers. We take advantage of the recently proposed question meaning representation called QDMR, an intermediate representation between NL and formal query languages.
Given questions, their QDMR structures (annotated by non-experts or automatically predicted), and their answers, we automatically synthesize SQL queries that are used to train text-to-SQL models. We test our approach on five benchmark datasets. Our results show that the weakly supervised models perform competitively with those trained on annotated NL-SQL data.
Overall, we effectively train text-to-SQL parsers, while using zero SQL annotations.
Copyright Consent Signature (type Name Or NA If Not Transferrable): Tomer Wolfson
Copyright Consent Name And Address: Tel Aviv University, Chaim Levanon St, Tel Aviv, Israel
Presentation Mode: This paper will be presented in person in Seattle