TALENT: Tree-structured Adaptive Learning for Efficient Text-to-SQL Generation

ACL ARR 2025 February Submission 5236 Authors

16 Feb 2025 (modified: 09 May 2025) · ACL ARR 2025 February Submission · CC BY 4.0
Abstract: Text-to-SQL systems face increasing challenges in managing complex query generation tasks while maintaining computational efficiency. Recent approaches leverage Large Language Models (LLMs) through chain-based decomposition, but they often struggle with error propagation and limited adaptability. To address these challenges, we propose TALENT, a hybrid framework that combines tree-structured task planning with reinforcement learning optimization. Our solution contributes two methodological advances: (1) a flexible tree-based decomposition framework that enables targeted error recovery and reduces inter-task coupling, and (2) a reinforcement learning-enhanced adaptive path optimization mechanism that leverages historical execution patterns to improve model performance. Our empirical evaluation on the Spider benchmark demonstrates TALENT's effectiveness, achieving 85.8% execution accuracy with minimal training examples. Through systematic ablation studies and artifact analysis, we further demonstrate the framework's robustness against dataset biases. These results indicate that structured task orchestration, coupled with self-improving optimization, can effectively meet the accuracy and reliability demands of text-to-SQL conversion. The complete implementation is available at https://github.com/FIC/TALENT.
Paper Type: Long
Research Area: NLP Applications
Research Area Keywords: Reinforcement Learning, Structured Prediction, Code Generation, Few-shot Learning, Prompting
Contribution Types: NLP engineering experiment, Approaches to low-resource settings, Approaches to low-compute settings (efficiency)
Languages Studied: English, SQL
Submission Number: 5236