Zero-shot Cross-lingual Transfer is Under-specified Optimization

Anonymous

16 Oct 2021 (modified: 05 May 2023) · ACL ARR 2021 October Blind Submission · Readers: Everyone
Abstract: Pretrained multilingual encoders enable zero-shot cross-lingual transfer, but often produce unreliable models that exhibit high performance variance on the target language. We postulate that this high variance arises because zero-shot cross-lingual transfer solves an under-specified optimization problem. We show that a source-language monolingual model and a source + target bilingual model are linearly connected via model interpolation, suggesting that the model struggles to identify solutions that are good for both the source and target languages when trained on the source language alone.
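The model interpolation mentioned in the abstract can be sketched as a linear path in parameter space between the two trained models. The function below is a minimal, hypothetical illustration (not the authors' code): it mixes two parameter dictionaries with a coefficient alpha, and evaluating the interpolated model at several alpha values along [0, 1] would reveal whether a loss barrier separates the two solutions.

```python
import numpy as np

def interpolate_params(params_a, params_b, alpha):
    """Return (1 - alpha) * params_a + alpha * params_b, key by key.

    params_a, params_b: dicts mapping parameter names to numpy arrays
    (e.g., a monolingual and a bilingual model with identical architecture).
    alpha = 0 recovers params_a; alpha = 1 recovers params_b.
    """
    return {name: (1.0 - alpha) * params_a[name] + alpha * params_b[name]
            for name in params_a}

# Toy example: two 2-parameter "models".
mono = {"w": np.array([0.0, 2.0])}
bilingual = {"w": np.array([2.0, 0.0])}

# Midpoint of the linear path between the two models.
midpoint = interpolate_params(mono, bilingual, 0.5)
```

In practice, one would evaluate the task loss at each interpolated point; if the loss stays low along the whole path, the two solutions are linearly connected.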