Enhancing Cross-lingual Prompting with Two-level Augmentation

Anonymous

17 Dec 2021 (modified: 05 May 2023) · ACL ARR 2021 December Blind Submission
Abstract: Prompting approaches show promising results in few-shot scenarios. However, their strength for multilingual/cross-lingual problems has not been fully exploited. Zhao and Schütze (2021) made an initial exploration in this direction, showing that cross-lingual prompting outperforms cross-lingual finetuning. In this paper, we first conduct a sensitivity analysis on the effect of each component in cross-lingual prompting and derive Universal Prompting across languages. Building on this, we propose a two-level augmentation framework to further improve the performance of prompt-based cross-lingual transfer. Notably, on XNLI, our method achieves 46.54% accuracy with only 16 English training examples per class, significantly better than the 34.99% achieved by finetuning.
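For readers unfamiliar with the prompting setup the abstract refers to, below is a minimal sketch of cloze-style cross-lingual prompting for XNLI with a multilingual masked language model. The template, the verbalizer words, and the choice of xlm-roberta-base are illustrative assumptions, not the paper's exact configuration: the model fills a mask token, and the label whose verbalizer word scores highest is predicted. Because the encoder is multilingual, the same English-designed prompt can be applied to premise/hypothesis pairs in other languages, which is the basis of zero-shot cross-lingual transfer.

# Minimal sketch of cloze-style prompting for XNLI with a multilingual
# masked LM. Template and verbalizer are assumptions for illustration,
# not the method proposed in this paper.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")
model.eval()

# Hypothetical verbalizer: one answer word per XNLI label.
verbalizer = {"entailment": "Yes", "neutral": "Maybe", "contradiction": "No"}
label_ids = {
    label: tokenizer.convert_tokens_to_ids(tokenizer.tokenize(" " + word))[0]
    for label, word in verbalizer.items()
}

def classify(premise: str, hypothesis: str) -> str:
    # Cloze template: the model is asked to fill in the masked answer word.
    text = f"{premise} ? {tokenizer.mask_token} , {hypothesis}"
    inputs = tokenizer(text, return_tensors="pt")
    # Locate the mask position in the tokenized input.
    mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]
    # Predict the label whose verbalizer token receives the highest logit.
    return max(label_ids, key=lambda lbl: logits[label_ids[lbl]].item())

print(classify("A man is playing a guitar.", "A person is making music."))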
Paper Type: short
Consent To Share Data: yes