On the Choice of Auxiliary Languages for Improved Sequence Tagging

RepL4NLP@ACL 2020 (modified: 14 Dec 2023)
Abstract: Recent work has shown that embeddings from related languages can improve the performance of sequence tagging, even for monolingual models. In this analysis paper, we investigate whether the best auxiliary language can be predicted from language distances and show that the most closely related language is not always the best auxiliary language. Further, we show that attention-based meta-embeddings can effectively combine pre-trained embeddings from different languages for sequence tagging, and we set new state-of-the-art results for part-of-speech tagging in five languages.
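The attention-based meta-embedding idea mentioned in the abstract can be illustrated with a minimal sketch: each language-specific embedding is projected into a shared space, a learned attention vector scores each projection, and the meta-embedding is the softmax-weighted sum. All names, dimensions, and the use of NumPy here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_meta_embedding(embeddings, projections, attn_vector):
    """Combine embeddings of different dimensions into one meta-embedding.

    embeddings:  list of 1-D vectors, one per language (dims may differ)
    projections: list of matrices mapping each embedding to a shared dim
    attn_vector: shared vector producing a scalar relevance score
    (all parameters would be learned jointly with the tagger in practice)
    """
    # project each language-specific embedding into the shared space
    projected = [P @ e for P, e in zip(projections, embeddings)]
    # one scalar attention score per projected embedding
    scores = np.array([attn_vector @ h for h in projected])
    weights = softmax(scores)
    # meta-embedding = attention-weighted sum of the projections
    return sum(w * h for w, h in zip(weights, projected))

# toy example: three auxiliary languages with different embedding sizes
rng = np.random.default_rng(0)
dims, shared = [300, 100, 200], 128
embs = [rng.standard_normal(d) for d in dims]
projs = [rng.standard_normal((shared, d)) for d in dims]
attn = rng.standard_normal(shared)
meta = attention_meta_embedding(embs, projs, attn)
print(meta.shape)  # → (128,)
```

In a real tagger these projections and the attention vector are trained end to end, so the model can learn per-token how much weight to place on each auxiliary language's embedding.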
