Adaptive Cross-lingual Text Classification through In-Context One-Shot Demonstrations

Anonymous

16 Dec 2023 · ACL ARR 2023 December Blind Submission · Readers: Everyone
Abstract: Zero-shot cross-lingual transfer (ZS-XLT) uses a model trained in a source language to make predictions in a target language. However, this approach often incurs a performance loss in the target language, which is typically mitigated by subsequent fine-tuning on target-language demonstrations. In this paper, we exploit In-Context Tuning (ICT) for one-shot cross-lingual transfer in the classification task by introducing In-Context Cross-lingual Transfer (IC-XLT). The novel concept involves training a model to learn from context examples and subsequently adapting it at inference to a target language using one-shot context demonstrations in that language. Remarkably, this adaptation requires no fine-tuning to reduce the performance gap with the source language. Our results show that IC-XLT successfully leverages these demonstrations to improve the cross-lingual capabilities of the evaluated mT5 model, outperforming prompt-based fine-tuned models in the zero- and one-shot scenarios. Moreover, we show that when source-language data is limited, the fine-tuning framework employed for IC-XLT performs comparably to prompt-based fine-tuning with significantly more training data in the source language. Hence, IC-XLT also presents a compelling alternative for one-shot cross-lingual transfer in scenarios where computational resources or source-language data are constrained.
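The abstract describes adapting the model at inference by prepending a one-shot target-language demonstration to the input, with no gradient updates. The sketch below illustrates that inference step under stated assumptions: the prompt template, the mT5 checkpoint, and the example texts are illustrative placeholders, not the paper's actual setup.

```python
# Hedged sketch of IC-XLT-style one-shot inference: a single target-language
# demonstration is placed in-context before the query; no fine-tuning occurs.
# The template and checkpoint below are assumptions for illustration only.
from transformers import AutoTokenizer, MT5ForConditionalGeneration

model_name = "google/mt5-small"  # placeholder checkpoint; the paper evaluates an mT5 model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = MT5ForConditionalGeneration.from_pretrained(model_name)

# One-shot demonstration in the target language (Spanish here) and a query to classify.
demo_text, demo_label = "La película fue excelente.", "positive"
query_text = "No me gustó el servicio."

# Concatenate demonstration and query into a single in-context prompt.
prompt = f"text: {demo_text} label: {demo_label} text: {query_text} label:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=5)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

In this sketch the model is assumed to have already been in-context tuned on the source language, so that at inference it can read the label of the prepended demonstration and apply it to the query without any parameter updates.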
Paper Type: long
Research Area: Multilinguality and Language Diversity
Contribution Types: Approaches to low-resource settings, Approaches to low-compute settings / efficiency
Languages Studied: English, Spanish, French, Russian, Thai, Dutch, Turkish