Correlation-Aware Example Selection for In-Context Learning with Nonsymmetric Determinantal Point Processes

ACL ARR 2025 February Submission 8460 Authors

16 Feb 2025 (modified: 09 May 2025) · ACL ARR 2025 February Submission · CC BY 4.0
Abstract: LLMs with in-context learning (ICL) achieve remarkable performance but are sensitive to the quality of ICL examples. Prior work on ICL example selection has explored unsupervised heuristic methods and supervised LLM-feedback-based methods, but these typically focus on selecting individual examples and ignore correlations among examples. Recent work proposes using the determinantal point process (DPP) to model negative correlations among examples and thereby select diverse example sets. However, the DPP cannot model positive correlations among examples, yet ICL also relies on positive correlations to keep the example set consistent and provide clear guidance for LLMs. In this paper, we propose an ICL example selection framework based on the nonsymmetric determinantal point process (NDPP), which captures both positive and negative correlations and thus accounts for both the diversity and the relevance of ICL examples. Specifically, we optimize the NDPP via kernel-decomposition-based maximum likelihood estimation (MLE) to fit a constructed pseudo-labeled dataset, and we propose a low-rank decomposition to reduce the computational cost. We further perform query-aware kernel adaptation to customize the NDPP to the input query, and we select examples via maximum-a-posteriori (MAP) inference on the adapted NDPP. Experiments show that our model outperforms strong baselines on ICL example selection.
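To make the NDPP-based selection concrete, below is a minimal sketch in Python. It assumes a standard low-rank NDPP parameterization L = V Vᵀ + B (C − Cᵀ) Bᵀ, where the skew-symmetric part lets the kernel encode positive as well as negative correlations, and it selects examples with a simple greedy MAP routine over candidate indices. The feature matrices, dimensions, and function names here are illustrative assumptions, not the paper's actual implementation (which additionally performs MLE training on a pseudo-labeled dataset and query-aware kernel adaptation).

```python
import numpy as np

def ndpp_kernel(V, B, C):
    """Low-rank NDPP kernel L = V V^T + B (C - C^T) B^T (illustrative sketch).

    V, B: (n, k) item feature matrices; C: (k, k) parameter matrix whose
    skew-symmetric part models directed (positive/negative) correlations.
    """
    skew = C - C.T                      # enforce skew-symmetry
    return V @ V.T + B @ skew @ B.T

def greedy_map_select(L, k):
    """Greedy MAP inference: repeatedly add the item maximizing det(L_S)."""
    selected = []
    remaining = list(range(L.shape[0]))
    for _ in range(k):
        best, best_det = None, -np.inf
        for i in remaining:
            S = selected + [i]
            det = np.linalg.det(L[np.ix_(S, S)])
            if det > best_det:
                best, best_det = i, det
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy usage: random features for n candidate ICL examples (hypothetical data).
rng = np.random.default_rng(0)
n, k = 50, 8
V = rng.normal(size=(n, k))   # symmetric part: relevance/diversity features
B = rng.normal(size=(n, k))   # nonsymmetric part: directed-correlation features
C = rng.normal(size=(k, k))
L = ndpp_kernel(V, B, C)
print(greedy_map_select(L, 4))  # indices of 4 selected ICL examples
```

Because the symmetric part V Vᵀ is positive semidefinite and the remaining term is skew-symmetric, every principal minor of L is nonnegative, so det(L_S) remains a valid (unnormalized) set score; the skew-symmetric component is what allows the model to reward, rather than only penalize, co-occurrence of related examples.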
Paper Type: Long
Research Area: Syntax: Tagging, Chunking and Parsing
Research Area Keywords: prompting
Contribution Types: Model analysis & interpretability, NLP engineering experiment
Languages Studied: English
Submission Number: 8460