Ungrammatical-syntax-based In-context Example Selection for Grammatical Error Correction

Anonymous

16 Dec 2023, ACL ARR 2023 December Blind Submission
TL;DR: We propose ungrammatical-syntax-based in-context example selection approaches for in-context learning with large language models on grammatical error correction.
Abstract: In the era of large language models (LLMs), in-context learning (ICL) stands out as an effective prompting strategy that unlocks LLMs' capabilities across various tasks. However, applying LLMs to grammatical error correction (GEC) remains challenging. In this paper, we propose a novel ungrammatical-syntax-based in-context example selection strategy for GEC. Specifically, we measure the similarity of texts based on their syntactic structure using diverse algorithms, and identify the optimal ICL examples whose ill-formed syntax is most similar to that of the test sample. Additionally, we apply a two-stage process to further improve the quality of the selection results. On benchmark English GEC datasets, empirical results show that our proposed ungrammatical-syntax-based strategies outperform commonly used word-matching methods across multiple LLMs. This indicates that for a syntax-oriented task like GEC, paying more attention to syntactic information can effectively boost LLMs' performance. Our code will be made publicly available after the publication of this paper.
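The abstract does not specify the paper's similarity algorithms or its two-stage process, but the core selection step can be sketched in a minimal, hypothetical form: represent each sentence's (possibly ill-formed) syntax as a POS-tag sequence produced by some parser, and rank candidate in-context examples by sequence similarity to the test sample. The metric, the toy pool, and the tag sequences below are all illustrative assumptions, not the paper's method.

```python
# Hypothetical sketch of syntax-based ICL example selection.
# Assumption: each sentence comes with a pre-computed POS-tag sequence
# (in practice produced by a parser run on the erroneous text); here we
# hardcode toy tags and use difflib as a stand-in similarity metric.
from difflib import SequenceMatcher

def syntax_similarity(tags_a, tags_b):
    """Similarity of two POS-tag sequences in [0, 1] (a stand-in metric)."""
    return SequenceMatcher(None, tags_a, tags_b).ratio()

def select_icl_examples(test_tags, pool, k=2):
    """Return the k pool entries whose tag sequences best match test_tags."""
    ranked = sorted(
        pool,
        key=lambda ex: syntax_similarity(test_tags, ex["tags"]),
        reverse=True,
    )
    return ranked[:k]

# Toy candidate pool: ungrammatical sentences paired with their POS tags.
pool = [
    {"text": "He go to school yesterday.",
     "tags": ["PRON", "VERB", "ADP", "NOUN", "ADV", "PUNCT"]},
    {"text": "The cats is hungry.",
     "tags": ["DET", "NOUN", "AUX", "ADJ", "PUNCT"]},
    {"text": "She like apples very much.",
     "tags": ["PRON", "VERB", "NOUN", "ADV", "ADV", "PUNCT"]},
]

# Test sample, e.g. "They goes to work." with the same kind of tagging.
test_tags = ["PRON", "VERB", "ADP", "NOUN", "PUNCT"]
top = select_icl_examples(test_tags, pool, k=1)
# The structurally closest candidate ("He go to school yesterday.") ranks first.
```

A second re-ranking pass over the top-k candidates (e.g. with a finer-grained tree comparison) would correspond to the two-stage refinement the abstract mentions, whose details are left to the paper.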
Paper Type: long
Research Area: NLP Applications
Contribution Types: Approaches to low-resource settings
Languages Studied: English