In-context Learning with Retrieved Demonstrations for Language Models: A Survey

Published: 11 Oct 2024, Last Modified: 11 Oct 2024
Accepted by TMLR
License: CC BY 4.0
Abstract: Large language models have demonstrated remarkable few-shot in-context learning (ICL) capabilities, adapting to new tasks from only a few demonstrations. However, the efficacy of ICL is highly dependent on the selection of these demonstrations. Recent developments have introduced retrieval-based in-context learning (RetICL), which dynamically retrieves demonstrations tailored to each input query. This approach leverages existing databases and retrieval systems, enhancing efficiency and scalability while mitigating the biases inherent in manual example selection. Given the promising results and growing interest in RetICL, we present a comprehensive survey of this field. Our review encompasses design choices for ICL demonstration retrieval models, retrieval training procedures, inference strategies, and current applications of RetICL. Finally, we explore future directions for this emerging technology.
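To make the retrieve-then-prompt loop described in the abstract concrete, here is a minimal sketch. All names (`embed`, `DemoPool`, `build_prompt`) are hypothetical, and the hashed bag-of-words encoder is a toy stand-in for the dense retrievers the survey actually covers; it illustrates only the generic pattern of retrieving per-query demonstrations and prepending them to the prompt.

```python
# Minimal RetICL sketch: retrieve the demonstrations most similar to the
# query, then assemble a few-shot prompt. Toy encoder; not any specific
# method from the survey.
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy stand-in for a sentence encoder: hashed bag-of-words, L2-normalized."""
    v = np.zeros(dim)
    for tok in text.lower().split():
        v[hash(tok) % dim] += 1.0
    n = np.linalg.norm(v)
    return v / n if n else v

class DemoPool:
    """Pool of (input, output) demonstrations with dense similarity search."""
    def __init__(self, demos):
        self.demos = demos
        self.matrix = np.stack([embed(x) for x, _ in demos])

    def retrieve(self, query: str, k: int = 3):
        scores = self.matrix @ embed(query)  # cosine similarity (unit vectors)
        top = np.argsort(-scores)[:k]
        return [self.demos[i] for i in top]

def build_prompt(query: str, pool: DemoPool, k: int = 3) -> str:
    """Prepend the k retrieved demonstrations, then append the query."""
    shots = pool.retrieve(query, k)
    blocks = [f"Input: {x}\nOutput: {y}" for x, y in shots]
    blocks.append(f"Input: {query}\nOutput:")
    return "\n\n".join(blocks)

if __name__ == "__main__":
    pool = DemoPool([
        ("translate 'cat' to French", "chat"),
        ("translate 'dog' to French", "chien"),
        ("what is 2 + 2?", "4"),
    ])
    print(build_prompt("translate 'bird' to French", pool, k=2))
```

In a real RetICL system the toy encoder would be replaced by a trained retriever, and the resulting prompt would be sent to a language model; the survey's taxonomy of retriever designs, training procedures, and inference strategies all slot into this loop.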
Certifications: Survey Certification
Submission Length: Long submission (more than 12 pages of main content)
Previous TMLR Submission Url: https://openreview.net/forum?id=NQPo8ZhQPa
Changes Since Last Submission: We have carefully addressed each reviewer's suggestions by adding citations, clarifying ambiguous wording, and adding analysis.
Code: https://github.com/luomancs/luomancs-reticl_llm_survey/
Assigned Action Editor: ~Jake_Snell1
Submission Number: 2428