Abstract: Few-shot learning via in-context learning (ICL) is widely used in NLP, but its effectiveness is highly sensitive to example selection, leading to performance variance. To address this, we introduce BACKGEN, a framework to generate structured Background Knowledge (BK) as an alternative to example-based prompting. Our approach leverages Frame Semantics to identify recurring conceptual patterns in a dataset, clustering similar instances based on shared event structures and semantic roles.
Using an LLM, we synthesize these patterns into generalized knowledge statements, which are then incorporated into prompts to enhance contextual reasoning beyond individual sentence interpretations. We apply BACKGEN to Sentiment Phrase Classification (SPC), where sentiment polarity often depends on implicit commonsense knowledge. Experimental results with Mistral-7B and Llama3-8B show that BK-based prompting consistently outperforms standard few-shot approaches, yielding up to 29.94% error reduction.
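For readers who want a concrete picture of the pipeline the abstract describes, below is a minimal sketch of its three steps: cluster instances by evoked frame, synthesize a background-knowledge (BK) statement per cluster with an LLM, and prepend the BK statements to the classification prompt in place of few-shot examples. The `call_llm` helper, the assumption that frame labels are already available per sentence, and all prompt wording are illustrative placeholders, not the paper's actual implementation.

```python
from collections import defaultdict


def call_llm(prompt: str) -> str:
    # Hypothetical stand-in: route `prompt` to the LLM of your choice
    # (e.g. a local Mistral-7B or Llama3-8B endpoint) and return its completion.
    raise NotImplementedError


def cluster_by_frame(examples):
    """Group (sentence, frame) pairs by the frame they evoke."""
    clusters = defaultdict(list)
    for sentence, frame in examples:
        clusters[frame].append(sentence)
    return clusters


def synthesize_bk(frame: str, sentences: list[str]) -> str:
    """Ask the LLM to generalize one cluster into a single BK statement."""
    joined = "\n".join(f"- {s}" for s in sentences)
    prompt = (
        f"The following sentences all evoke the frame '{frame}':\n{joined}\n"
        "Write one general commonsense statement describing the sentiment "
        "these situations typically carry."
    )
    return call_llm(prompt)


def classify_with_bk(phrase: str, bk_statements: list[str]) -> str:
    """Prepend BK statements to the prompt instead of in-context examples."""
    bk_block = "\n".join(f"- {b}" for b in bk_statements)
    prompt = (
        f"Background knowledge:\n{bk_block}\n\n"
        f"Classify the sentiment polarity of the phrase: \"{phrase}\"\n"
        "Answer with positive, negative, or neutral."
    )
    return call_llm(prompt)
```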
Paper Type: Long
Research Area: Sentiment Analysis, Stylistic Analysis, and Argument Mining
Research Area Keywords: few-shot learning, semantic textual similarity, frame detection and analysis
Contribution Types: NLP engineering experiment, Data resources
Languages Studied: English
Submission Number: 1410