The role of diversity in in-context learning for large language models

ACL ARR 2026 January Submission 443 Authors

22 Dec 2025 (modified: 20 Mar 2026) · ACL ARR 2026 January Submission · CC BY 4.0
Keywords: Data selection, Diversity, In-context Learning
Abstract: In-context learning (ICL) is a crucial capability of current large language models (LLMs), where the selection of examples plays a key role in performance. While most existing approaches focus on selecting the examples most similar to the query, the impact of diversity in example selection remains underexplored. We systematically investigate the role of diversity in in-context example selection through experiments across a range of tasks, from sentiment classification to more challenging math and code problems. Experiments on the Llama-3.1, Gemma-2, and Mistral-v0.3 families of models show that diversity-aware selection methods improve performance, particularly on complex tasks like math and code, and enhance robustness to out-of-distribution queries. To support these findings, we introduce a theoretical framework that explains the benefits of incorporating diversity in in-context example selection.
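The abstract contrasts pure similarity-based selection with diversity-aware selection of in-context examples. As a minimal sketch of what a diversity-aware selector might look like (the paper's actual method is not specified here), the snippet below implements maximal marginal relevance (MMR) over candidate example embeddings: each pick trades off similarity to the query against redundancy with examples already chosen. The function names, the trade-off parameter `lam`, and the toy embeddings are all illustrative assumptions, not the authors' implementation.

```python
import math

def cosine(u, v):
    # Cosine similarity between two dense embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def mmr_select(query, candidates, k, lam=0.5):
    """Pick k in-context examples by maximal marginal relevance.

    lam = 1.0 recovers pure similarity-to-query selection;
    smaller lam penalizes redundancy with already-selected examples,
    i.e. it enforces diversity among the chosen demonstrations.
    """
    selected = []
    remaining = list(range(len(candidates)))
    while remaining and len(selected) < k:
        def score(i):
            relevance = cosine(query, candidates[i])
            redundancy = max(
                (cosine(candidates[i], candidates[j]) for j in selected),
                default=0.0,
            )
            return lam * relevance - (1 - lam) * redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

With a toy query `[1, 0]` and candidates `[[1, 0], [0.99, 0.1], [0, 1]]`, pure similarity (`lam=1.0`) picks the two near-duplicates, while a diversity-weighted setting such as `lam=0.3` swaps the second pick for the orthogonal example.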
Paper Type: Long
Research Area: Language Models
Research Area Keywords: Generation, Language Modeling
Contribution Types: Model analysis & interpretability, Reproduction study, Data analysis, Surveys, Theory
Languages Studied: English
Submission Number: 443