Keywords: natural language processing, machine translation, low-resource languages, linguistic diversity, in-context learning, prompt engineering
Abstract: Existing large language models still fail to support many low-resource languages. For extremely low-resource languages in particular, there is hardly any training data available to effectively update model parameters. We therefore investigate whether LLMs can learn a new language on the fly through in-context learning. To study this question, we collect a tiny parallel corpus for Zhuang, a language currently supported by no LLM. We evaluate the performance of various LLMs on the Zhuang-Chinese translation task and find that this learning paradigm shows great potential.
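A minimal sketch of how an in-context learning prompt for this setting might be assembled from a tiny parallel corpus; the function name, prompt wording, and placeholder sentence pairs are hypothetical, not taken from the paper:

```python
# Hypothetical sketch: assembling a few-shot in-context learning prompt
# for Zhuang -> Chinese translation from a tiny parallel corpus.
# The sentence pairs below are placeholders for illustration only.

def build_icl_prompt(parallel_pairs, source_sentence):
    """Build a translation prompt from (source, target) example pairs."""
    lines = ["Translate the following Zhuang sentences into Chinese."]
    for src, tgt in parallel_pairs:
        lines.append(f"Zhuang: {src}")
        lines.append(f"Chinese: {tgt}")
    # Append the new sentence and leave the target side open for the model.
    lines.append(f"Zhuang: {source_sentence}")
    lines.append("Chinese:")
    return "\n".join(lines)

# Placeholder corpus entries, not real Zhuang data.
corpus = [("<zhuang sentence 1>", "<chinese sentence 1>"),
          ("<zhuang sentence 2>", "<chinese sentence 2>")]
prompt = build_icl_prompt(corpus, "<new zhuang sentence>")
print(prompt)
```

The resulting string would be sent as a single prompt to an LLM, which is expected to continue after the final "Chinese:" with its translation.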
Submission Number: 168