MetaXLR - Mixed Language Meta Representation Transformation for Low-resource Cross-lingual Learning based on Multi-Armed Bandit

01 Mar 2023 (modified: 03 Nov 2024) · Submitted to Tiny Papers @ ICLR 2023 · Readers: Everyone
Abstract: Transfer learning for extremely low-resource languages is a challenging task, as there are no large-scale monolingual corpora for pre-training and no sufficient annotated data for fine-tuning. We follow the work of Xia et al. (2021), which uses meta learning to transfer from a single source language to an extremely low-resource one. We propose an enhanced approach that uses multiple source languages chosen in a data-driven manner. In addition, we introduce a sample selection strategy that decides which source language to draw from at each training step using a multi-armed bandit algorithm. With both of these improvements we achieve state-of-the-art results on the NER task for extremely low-resource languages, even with the same amount of data, yielding representations that generalize better. Moreover, because the method can draw on multiple source languages, it can exploit much larger amounts of data, while still outperforming the former MetaXL method when restricted to the same amount of data.
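The bandit-based sample selection described above can be sketched as an EXP3-style sampler over source languages. This is a minimal illustration only: the paper's exact bandit variant, reward signal, and language set are not specified here, so the language codes and the reward definition (e.g. a normalized improvement on target-language dev loss) are assumptions.

```python
import math
import random

class Exp3LanguageSampler:
    """EXP3-style multi-armed bandit over candidate source languages.

    Hypothetical sketch: each "arm" is a source language; the reward
    (assumed to lie in [0, 1]) could be, e.g., a normalized gain on the
    low-resource target language's dev metric after a training step.
    """

    def __init__(self, languages, gamma=0.1):
        self.languages = list(languages)
        self.gamma = gamma                      # exploration rate
        self.weights = {lg: 1.0 for lg in self.languages}

    def probabilities(self):
        total = sum(self.weights.values())
        k = len(self.languages)
        # Mix the weight-proportional distribution with uniform exploration.
        return {
            lg: (1.0 - self.gamma) * w / total + self.gamma / k
            for lg, w in self.weights.items()
        }

    def sample(self):
        # Draw the next source language to train on.
        probs = self.probabilities()
        r, acc = random.random(), 0.0
        for lg, p in probs.items():
            acc += p
            if r <= acc:
                return lg
        return self.languages[-1]

    def update(self, language, reward):
        # Importance-weighted reward estimate, as in standard EXP3.
        p = self.probabilities()[language]
        estimate = reward / p
        self.weights[language] *= math.exp(
            self.gamma * estimate / len(self.languages)
        )
```

In use, the trainer would call `sample()` to pick the language for the next batch, run a meta-learning step on that language's data, and feed the observed target-language improvement back via `update()`, so helpful source languages are sampled more often over time.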
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/metaxlr-mixed-language-meta-representation/code)