$R^3$-NL2GQL: A Model Coordination and Knowledge Graph Alignment Approach for NL2GQL

ACL ARR 2024 June Submission4917 Authors

16 Jun 2024 (modified: 09 Aug 2024) · CC BY 4.0
Abstract: While Foundation Models have achieved impressive results on converting natural language to SQL (NL2SQL), adapting these approaches to converting natural language to Graph Query Language (NL2GQL) faces hurdles due to the distinct nature of GQL compared to SQL and the diversity of GQL dialects. Moving away from traditional rule-based and slot-filling methodologies, we introduce a novel approach, $R^3$-NL2GQL, which coordinates small and large Foundation Models across ranking, rewriting, and refining tasks. This method leverages the interpretative strengths of smaller models for the initial ranking and rewriting stages, while capitalizing on the superior generalization and query-generation capabilities of larger models for the final transformation of natural language queries into GQL. Addressing the scarcity of datasets in this emerging field, we have developed a bilingual dataset sourced from graph database manuals and selected open-source Knowledge Graphs (KGs). Our evaluation on this dataset demonstrates the method's promising efficacy and robustness.
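The rank-rewrite-refine coordination described in the abstract can be sketched as a simple pipeline. This is a minimal illustration under stated assumptions: all function names, the toy relevance scorer, and the stub generator below are hypothetical stand-ins, not the authors' actual implementation or models.

```python
# Hypothetical sketch of the R^3 pipeline: a smaller model ranks relevant
# schema items and rewrites the question; a larger model refines it into GQL.
# Every name and the toy data here are illustrative assumptions.
from typing import Callable, List


def rank(question: str, schema_items: List[str],
         scorer: Callable[[str, str], float], k: int = 2) -> List[str]:
    """Small-model stage: keep the k schema items most relevant to the question."""
    return sorted(schema_items, key=lambda item: scorer(question, item), reverse=True)[:k]


def rewrite(question: str, ranked: List[str]) -> str:
    """Small-model stage: ground the question in the selected schema items."""
    return f"{question} [schema: {', '.join(ranked)}]"


def refine(rewritten: str, generate: Callable[[str], str]) -> str:
    """Large-model stage: generate the final GQL from the rewritten prompt."""
    return generate(rewritten)


def toy_scorer(question: str, item: str) -> float:
    # Crude relevance: count of shared lowercase words (stand-in for a small model).
    return float(len(set(question.lower().split()) & set(item.lower().split())))


def toy_generator(prompt: str) -> str:
    # Stand-in for a large Foundation Model producing a GQL query.
    return f"MATCH (n) RETURN n  // prompt: {prompt}"


question = "Which player scored the most?"
schema = ["player name", "team city", "match score"]
ranked = rank(question, schema, toy_scorer)
gql = refine(rewrite(question, ranked), toy_generator)
print(gql)
```

In the paper's actual setting, `toy_scorer` and `toy_generator` would be replaced by the smaller and larger Foundation Models respectively, and the ranked items would come from the target graph database's schema.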
Paper Type: Long
Research Area: Information Retrieval and Text Mining
Research Area Keywords: NL2GQL, Knowledge Graph Retrieval, Graph Database
Contribution Types: NLP engineering experiment, Data resources, Data analysis
Languages Studied: English, Chinese
Submission Number: 4917