What makes multilingual BERT multilingual?

CoRR 2020
Abstract: Recently, multilingual BERT has been shown to work remarkably well on cross-lingual transfer tasks, outperforming static non-contextualized word embeddings. In this work, we provide an in-depth experimental study to supplement the existing literature on cross-lingual ability. We compare the cross-lingual ability of non-contextualized and contextualized representation models trained on the same data. We find that data size and context window size are crucial factors for transferability.
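
As a minimal sketch (not the authors' code), the contrast the abstract draws between contextualized and non-contextualized representations can be illustrated with the Hugging Face transformers library: for the same input, one can extract multilingual BERT's context-dependent hidden states and its static input embeddings for comparison. The model name and example sentence below are illustrative assumptions.

    # Sketch: contextualized vs. non-contextualized representations from mBERT
    import torch
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
    model = AutoModel.from_pretrained("bert-base-multilingual-cased")
    model.eval()

    sentence = "The bank approved the loan."  # illustrative example (assumption)

    with torch.no_grad():
        inputs = tokenizer(sentence, return_tensors="pt")
        outputs = model(**inputs)

    # Contextualized representation: each subword's hidden state depends on
    # the surrounding context window.
    contextual = outputs.last_hidden_state[0]  # shape: (seq_len, hidden_size)

    # Non-contextualized baseline: the model's static input embedding for the
    # same subword ids, ignoring context entirely.
    static = model.get_input_embeddings()(inputs["input_ids"])[0]

    print(contextual.shape, static.shape)

In a cross-lingual transfer setup, such representations would typically be fed to a task classifier trained on one language and evaluated on another; the paper studies how factors like data size and context window size affect that transfer.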