Adapting to the Low-Resource Double-Bind: Investigating Low-Compute Methods on Low-Resource African Languages

Published: 03 Mar 2023, Last Modified: 15 Apr 2023
Venue: AfricaNLP 2023
Keywords: NER
TL;DR: Evaluating NLP methods on African languages under the joint constraints of low data and low compute.
Abstract: Many natural language processing (NLP) tasks rely on massive pretrained language models, which are computationally expensive. However, limited access to high-end computational resources, combined with the scarcity of data for African languages, constitutes a real barrier to research on these languages. In this work, we explore the applicability of low-compute approaches such as language adapters in the context of this low-resource double-bind. We aim to answer the following question: do language adapters allow those who are doubly bound by data and compute to build practically useful models? Through fine-tuning experiments on African languages, we evaluate their effectiveness as cost-effective approaches to low-resource African NLP. Using only free compute resources, we show that language adapters achieve performance comparable to that of massive pretrained language models, which are heavy on computational resources. This opens the door to further experimentation and exploration of the full extent of language adapters' capabilities.
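To make the approach concrete, below is a minimal sketch of adapter-based NER fine-tuning using the AdapterHub adapter-transformers library. The base model, the language-adapter id ("sw/wiki@ukp", Swahili), and the label count are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of adapter-based NER fine-tuning with the AdapterHub
# "adapter-transformers" library (a fork of HuggingFace transformers).
# The base model, the language-adapter id "sw/wiki@ukp" (Swahili), and
# the label count are illustrative assumptions, not the paper's setup.
from transformers import AutoAdapterModel, AutoTokenizer
import transformers.adapters.composition as ac

model_name = "xlm-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoAdapterModel.from_pretrained(model_name)

# Load a pretrained language adapter from AdapterHub (MAD-X style).
lang_adapter = model.load_adapter("sw/wiki@ukp", config="pfeiffer")

# Add a small task adapter plus a token-classification head for NER.
model.add_adapter("ner", config="pfeiffer")
model.add_tagging_head("ner", num_labels=9)  # e.g. BIO tags over 4 entity types + O

# Freeze the full backbone; only the task adapter and head are trained,
# which is what keeps this approach cheap in memory and compute.
model.train_adapter("ner")

# Stack the (frozen) language adapter under the trainable task adapter.
model.set_active_adapters(ac.Stack(lang_adapter, "ner"))

# From here, the model can be trained on a token-classification dataset
# with the standard HuggingFace Trainer (or AdapterTrainer).
```

Because only the adapter and head parameters receive gradients, this kind of setup fits on free-tier GPUs, which is the constraint the paper targets.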