Hire a Linguist!: Learning Endangered Languages with In-Context Linguistic Descriptions

Anonymous

16 Feb 2024 · ACL ARR 2024 February Blind Submission · Readers: Everyone
Abstract: How can large language models (LLMs) process and translate endangered languages? Many languages lack a corpus large enough to train a capable LLM, so existing LLMs rarely perform well on unseen, endangered languages. In contrast, we observe that 2,000 endangered languages, though lacking a large corpus, have a grammar book or a dictionary. We propose \method, a training-free approach that enables an LLM to process unseen languages that hardly occur in its pre-training data. Our key insight is to present linguistic knowledge of an unseen language in the LLM's prompt, including a dictionary, a grammar book, and morphologically analyzed input text. We implement \method on top of two models, GPT-4 and Mixtral, and evaluate their performance on 5 tasks across 8 endangered or low-resource languages. Our results show that \method raises GPT-4's translation performance from 0 to 10.5 BLEU across 10 language directions. Our findings demonstrate the tremendous value of linguistic knowledge in the age of LLMs for endangered languages. Our data, code, and model generations will be released to the public.
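To make the abstract's key idea concrete, below is a minimal sketch of how linguistic resources might be packed into an LLM prompt. This is not the authors' released code: the function name `build_prompt`, the resource placeholders, and the prompt layout are all illustrative assumptions based only on the abstract's description (dictionary, grammar book, morphological analysis in the prompt).

```python
# Minimal sketch (illustrative, not the paper's implementation): assembling an
# in-context prompt from linguistic resources for translating an unseen language.
from typing import Dict, List


def build_prompt(source_sentence: str,
                 dictionary_entries: Dict[str, str],
                 grammar_excerpts: List[str],
                 morphological_gloss: str,
                 source_lang: str,
                 target_lang: str = "English") -> str:
    """Pack dictionary entries, grammar notes, and a morphological analysis of
    the input sentence into a single prompt for a general-purpose LLM."""
    dict_block = "\n".join(f"- {word}: {meaning}"
                           for word, meaning in dictionary_entries.items())
    grammar_block = "\n".join(f"- {rule}" for rule in grammar_excerpts)
    return (
        f"You are translating from {source_lang} to {target_lang}.\n\n"
        f"Relevant dictionary entries:\n{dict_block}\n\n"
        f"Relevant grammar notes:\n{grammar_block}\n\n"
        f"Morphological analysis of the input:\n{morphological_gloss}\n\n"
        f"Sentence: {source_sentence}\n"
        f"Translation:"
    )


if __name__ == "__main__":
    # Toy placeholders; a real run would retrieve these snippets from a
    # digitized dictionary and grammar book for the target language.
    prompt = build_prompt(
        source_sentence="<sentence in the endangered language>",
        dictionary_entries={"<word>": "<dictionary gloss>"},
        grammar_excerpts=["<relevant grammar-book passage>"],
        morphological_gloss="<word> = <stem>-<suffix> (e.g., plural marker)",
        source_lang="<endangered language>",
    )
    print(prompt)  # The assembled prompt would then be sent to GPT-4 or Mixtral.
```

Because the approach is training-free, all of the adaptation happens in this prompt-construction step rather than in model weights.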
Paper Type: long
Research Area: NLP Applications
Contribution Types: NLP engineering experiment, Approaches to low-resource settings
Languages Studied: Manchu, Gitksan, Uspanteko, Natugu, Arapaho, Tsez, Wolof, Bribri