Pretrained Language Models Are All You Need For Text-to-SQL Schema Linking

Anonymous

17 Dec 2021 (modified: 05 May 2023) · ACL ARR 2021 December Blind Submission · Readers: Everyone
Abstract: Exact Match based Schema Linking (EMSL) has become standard in text-to-SQL: many state-of-the-art models employ it, and their performance drops significantly when the EMSL component is removed. In this work, however, we demonstrate that EMSL reduces robustness, leaving models vulnerable to synonym substitution and typos. Instead of relying on EMSL to compensate for deficiencies in question-schema encoding, we show that using a pre-trained language model as the encoder improves performance without EMSL, yielding a more robust model. Our experiments suggest that EMSL is not the icing on the cake but rather the source of the vulnerability, and that it can be replaced by better input encoding.
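
For readers unfamiliar with the technique the abstract critiques, the sketch below illustrates what a naive exact-match schema linker does and why synonym substitution or a typo breaks it. The function, schema, and questions are hypothetical examples for illustration only, not the paper's implementation.

```python
from typing import Dict, List


def exact_match_schema_links(
    question: str,
    schema: Dict[str, List[str]],  # table name -> list of column names
    max_ngram: int = 3,
) -> List[str]:
    """Return schema items whose names appear verbatim in the question."""
    tokens = question.lower().split()

    # Collect candidate schema item names (tables and columns), normalized.
    items = set(schema.keys())
    for cols in schema.values():
        items.update(cols)
    items = {name.lower().replace("_", " ") for name in items}

    # Compare every n-gram of the question against every schema item name.
    links = []
    for n in range(1, max_ngram + 1):
        for i in range(len(tokens) - n + 1):
            ngram = " ".join(tokens[i : i + n])
            if ngram in items:
                links.append(ngram)
    return links


schema = {"singer": ["singer_id", "name", "net_worth"]}

print(exact_match_schema_links("show the net worth of each singer", schema))
# -> ['singer', 'net worth']

print(exact_match_schema_links("show the wealth of each vocalist", schema))
# -> []  (synonym substitution removes every exact-match link)
```

Because the match is purely lexical, replacing "singer" with "vocalist" or misspelling "net worth" leaves the linker with nothing to anchor on, which is the robustness failure the abstract attributes to EMSL.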
Paper Type: long
Consent To Share Data: yes