Track: Technical
Keywords: Multi-Culture LLMs, Inclusive LLMs, In-Context Learning
TL;DR: We explore the use of in-context learning with diverse demonstrations to enhance Large Language Models' cultural understanding
Abstract: Large Language Models (LLMs) have shown proficiency in various tasks but often struggle to capture cultural knowledge, especially for underrepresented regions. To adapt LLMs to diverse cultures, we explore the power of in-context learning (ICL), where models can leverage contextual demonstrations. Specifically, we investigate the effect of same-culture, different-culture (i.e., cross-cultural), or flawed in-context demonstrations on a cultural question-answering task across 16 cultures. Our findings show that demonstrations from the same culture generally enhance performance, while cross-cultural demonstrations sometimes outperform those from the same culture. However, incorrect cross-cultural demonstrations can substantially decrease performance. These results suggest that knowledge of well-known cultures can potentially enhance the models' understanding of marginalized ones. We leave the question of how to select which culture's demonstrations to use for future work, to better reflect the diversity of cultures within LLMs.
Submission Number: 98