Semantics or Syntax? Which is More Important in In-Context Learning for Sentence Classification

Anonymous

16 Feb 2024 · ACL ARR 2024 February Blind Submission · Readers: Everyone
Abstract: In this study, we explore the impact of semantics and syntax on the construction of demonstration examples for in-context learning (ICL) with large language models (LLMs). We identify the limitations of current methods that prioritize semantic similarity and underscore the importance of syntactic information, which has been underrepresented in sentence-level classification tasks. Through experiments measuring semantic and syntactic similarities, we find that ICL methods tend to favor syntactic congruence. We therefore propose a novel Semantics and Syntax-based Sentence Selection (SSSS) framework for choosing demonstration examples in ICL that integrates both semantic and syntactic dimensions, addressing the challenges of constructing accurate semantic representations and quantifying the similarity of syntactic structures. Experimental results on three datasets suggest that SSSS facilitates more effective ICL by incorporating syntax into demonstration example selection, potentially leading to enhanced model performance.
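The abstract does not specify how the two similarity signals are computed or combined. The sketch below shows one plausible instantiation only, assuming sentence-transformer embeddings for the semantic score and spaCy POS-tag sequences as a rough syntactic proxy; it is not the paper's actual SSSS scoring, and the function and parameter names (`select_demonstrations`, `alpha`, `k`) are illustrative.

```python
# Illustrative sketch: ranking candidate ICL demonstrations by a weighted mix
# of semantic and syntactic similarity to the query sentence.
# Assumes `sentence-transformers` and `spacy` (with en_core_web_sm) are installed;
# the paper's actual SSSS method may score and combine these signals differently.
from difflib import SequenceMatcher

import spacy
from sentence_transformers import SentenceTransformer, util

_nlp = spacy.load("en_core_web_sm")
_encoder = SentenceTransformer("all-MiniLM-L6-v2")


def semantic_similarity(a: str, b: str) -> float:
    """Cosine similarity between sentence embeddings."""
    emb = _encoder.encode([a, b], convert_to_tensor=True)
    return float(util.cos_sim(emb[0], emb[1]))


def syntactic_similarity(a: str, b: str) -> float:
    """Crude syntax proxy: overlap ratio of the two POS-tag sequences."""
    pos_a = [t.pos_ for t in _nlp(a)]
    pos_b = [t.pos_ for t in _nlp(b)]
    return SequenceMatcher(None, pos_a, pos_b).ratio()


def select_demonstrations(query: str, pool: list[str], k: int = 4,
                          alpha: float = 0.5) -> list[str]:
    """Return the top-k pool sentences under the combined similarity score."""
    scored = [
        (alpha * semantic_similarity(query, cand)
         + (1 - alpha) * syntactic_similarity(query, cand), cand)
        for cand in pool
    ]
    return [cand for _, cand in sorted(scored, reverse=True)[:k]]
```

A linear interpolation `alpha * semantic + (1 - alpha) * syntactic` is simply the most basic way to trade off the two signals; the paper's combination strategy may be more elaborate (for example, using parse-tree comparisons rather than POS sequences for the syntactic side).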
Paper Type: long
Research Area: Information Extraction
Contribution Types: NLP engineering experiment
Languages Studied: English