Abstract: Cross-lingual learning, which transfers knowledge from high-resource languages to low-resource languages, has been widely studied. With the recent rise of large language models (LLMs), in-context learning (ICL) has shown remarkable performance, eliminating the need for parameter fine-tuning and reducing the reliance on extensive labeled data. It is therefore tempting to apply cross-lingual ICL to cross-lingual tasks on top of multilingual LLMs. However, the intricacies of cross-lingual ICL remain underexplored. Prior studies on cross-lingual ICL overlooked language-specific nuances, neglecting both the intrinsic linguistic properties of sentences and the interlingual connections between sentences in different languages. In this paper, we propose a novel cross-lingual prompt structure: Language-Emphasized cross-lingual In-context learning (LEI). LEI aligns the languages of demonstrations and introduces a third language (the example language) as an explicit example of language conversion, adapting LLMs to the conversion required by cross-lingual tasks. Extensive experiments validate the state-of-the-art performance of LEI on 42 cross-lingual tasks.
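The abstract does not spell out the concrete prompt template, so the following is a minimal, hypothetical Python sketch of what a language-emphasized cross-lingual prompt with a third "example language" might look like. The function name build_lei_prompt, the language-tag format, and the sentiment-classification demonstrations are illustrative assumptions, not the paper's actual LEI template.

    # Hypothetical sketch: assembling a language-emphasized cross-lingual
    # ICL prompt with a third "example language". All names and formats
    # here are assumptions; the paper's actual LEI template may differ.
    def build_lei_prompt(demos, conversion_pair, query,
                         src_lang, tgt_lang, example_lang):
        lines = []
        # Language-conversion demonstration: the same sentence in the
        # source language and in a third example language, priming the
        # LLM for cross-lingual conversion before the task itself.
        lines.append(f"{src_lang}: {conversion_pair[src_lang]}")
        lines.append(f"{example_lang}: {conversion_pair[example_lang]}")
        # Language-tagged task demonstrations (here: sentiment labels),
        # all given in the source language.
        for text, label in demos:
            lines.append(f"{src_lang}: {text} => {label}")
        # Query in the target language; the model completes the label.
        lines.append(f"{tgt_lang}: {query} =>")
        return "\n".join(lines)

    # Example usage: English demonstrations, French as the example
    # language, and a German query.
    demos = [("The movie was wonderful.", "positive"),
             ("The service was awful.", "negative")]
    conversion_pair = {"en": "The movie was wonderful.",
                       "fr": "Le film était merveilleux."}
    print(build_lei_prompt(demos, conversion_pair,
                           "Das Essen war schrecklich.",
                           src_lang="en", tgt_lang="de", example_lang="fr"))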
Paper Type: long
Research Area: Multilinguality and Language Diversity
Contribution Types: NLP engineering experiment, Approaches to low-resource settings, Approaches to low compute settings-efficiency
Languages Studied: English, German, French, Spanish, Japanese, Mandarin