Language Representation in Multilingual BERT and its applications to improve Cross-lingual Generalization

CoRR 2020 (modified: 03 Nov 2022)
Abstract: Token embeddings in multilingual BERT (m-BERT) contain both language and semantic information. We find that the representation of a language can be obtained by simply averaging the embeddings of the tokens of the language. Given this language representation, we control the output languages of multilingual BERT by manipulating the token embeddings, thus achieving unsupervised token translation. We further propose a computationally cheap but effective approach to improve the cross-lingual ability of m-BERT based on this observation.
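The abstract's core idea can be sketched in a few lines: a language's vector is the mean of its token embeddings, and output language can be steered by subtracting the source language vector and adding the target one. The sketch below is a minimal illustration of that arithmetic with NumPy on hypothetical embedding matrices; the function names, the `alpha` scaling parameter, and the toy data are assumptions for illustration, not the paper's actual implementation over m-BERT.

```python
import numpy as np

def language_vector(token_embeddings):
    """Approximate a language's representation as the mean of the
    embeddings of that language's tokens (the paper's observation)."""
    return np.asarray(token_embeddings).mean(axis=0)

def shift_language(hidden_states, src_vec, tgt_vec, alpha=1.0):
    """Steer representations from the source language toward the target
    language by subtracting the source language vector and adding the
    target one, scaled by a hypothetical interpolation factor alpha."""
    return hidden_states - alpha * src_vec + alpha * tgt_vec

# Toy example with random stand-ins for two languages' token embeddings.
rng = np.random.default_rng(0)
en_tokens = rng.normal(size=(5, 4))  # hypothetical English token embeddings
fr_tokens = rng.normal(size=(5, 4))  # hypothetical French token embeddings

v_en = language_vector(en_tokens)
v_fr = language_vector(fr_tokens)

# Shifting the English embeddings moves their mean onto the French
# language vector, while preserving the per-token semantic offsets.
shifted = shift_language(en_tokens, v_en, v_fr)
```

With `alpha=1.0` the shifted embeddings' mean coincides exactly with the target language vector, since the source mean is removed and the target mean added; intermediate `alpha` values would interpolate between the two languages.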