A Balanced Data Approach for Evaluating Cross-Lingual Transfer: Mapping the Linguistic Blood Bank

Anonymous

08 Mar 2022 (modified: 05 May 2023) · NAACL 2022 Conference Blind Submission · Readers: Everyone
Paper Link: https://openreview.net/forum?id=Ynu6z5iWxWJ
Paper Type: Long paper (up to eight pages of content + unlimited references and appendices)
Abstract: We show that the choice of pretraining languages affects downstream cross-lingual transfer in BERT-based models. We inspect zero-shot performance under balanced data conditions to mitigate data-size confounds, classifying pretraining languages that improve downstream performance as donors, and languages whose zero-shot performance is improved as recipients. We develop a method that is quadratic in the number of languages to estimate these relations, rather than exhaustively evaluating all possible combinations, which is exponential. We find that our method is effective on a diverse set of languages spanning different linguistic features and two downstream tasks. Our findings can inform developers of large-scale multilingual language models in choosing better pretraining configurations.
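
The quadratic-versus-exponential contrast in the abstract can be illustrated with a small sketch. The code below is a hypothetical reading, not the paper's method or released code: it assumes each ordered language pair is scored once, so N languages require O(N^2) evaluations instead of the 2^N exhaustive subsets. The names zero_shot_score, donor_recipient_scores, and the particular gain definition are placeholders introduced here for illustration.

    # Hypothetical sketch, not the authors' released code: estimating
    # donor/recipient relations from pairwise pretraining runs.
    # `zero_shot_score(pretrain_langs, target)` is an assumed stand-in
    # for pretraining a balanced-data BERT on `pretrain_langs` and
    # measuring zero-shot downstream performance on `target`.
    from itertools import permutations

    def donor_recipient_scores(languages, zero_shot_score):
        # O(N^2) ordered pairs instead of the 2^N exhaustive subsets.
        donor = {lang: 0.0 for lang in languages}
        recipient = {lang: 0.0 for lang in languages}
        for src, tgt in permutations(languages, 2):
            # Gain on `tgt` from adding `src` to its pretraining mix,
            # relative to pretraining on `tgt` alone.
            gain = zero_shot_score((src, tgt), tgt) - zero_shot_score((tgt,), tgt)
            donor[src] += gain      # `src` improves others -> donor
            recipient[tgt] += gain  # `tgt` is improved     -> recipient
        n = len(languages) - 1
        donors = {lang: s / n for lang, s in donor.items()}
        recipients = {lang: s / n for lang, s in recipient.items()}
        return donors, recipients

Under this reading, a language's donor score averages the gains it confers on the other N-1 languages, and its recipient score averages the gains it receives from them.
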
Presentation Mode: This paper will be presented in person in Seattle
Virtual Presentation Timezone: UTC+3
Copyright Consent Signature (type Name Or NA If Not Transferrable): Gabriel Stanovsky
Copyright Consent Name And Address: School of Computer Science, The Hebrew University of Jerusalem