XLTime: A Cross-Lingual Knowledge Transfer Framework for Zero-Shot Low-Resource Language Temporal Expression Extraction

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission
Abstract: Temporal Expression Extraction (TEE) is essential for understanding time in natural language. It has applications in Natural Language Processing (NLP) tasks such as question answering, information retrieval, and causal inference. To date, work in this area has focused mostly on English, as TEE for low-resource languages is hindered by a scarcity of training data. We propose XLTime, a novel framework for zero-shot low-resource language TEE. XLTime works on top of pre-trained language models and leverages multi-task learning to promote cross-lingual knowledge transfer both from English and among the low-resource languages, alleviating the problems caused by the shortage of low-resource-language training data. We apply XLTime with different language models and show that it outperforms previous automatic SOTA methods on four low-resource languages, namely French, Spanish, Portuguese, and Basque, by large margins. It also considerably narrows the gap with the handcrafted HeidelTime tool.
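As a point of reference for what the task output looks like: TEE is commonly framed as token-level sequence labeling, where a model tags each token and the tagged tokens are then grouped into temporal-expression spans. The sketch below is purely illustrative and not taken from the paper; the `B-TIMEX`/`I-TIMEX` tag names are an assumed BIO-style scheme for timex spans.

```python
# Illustrative sketch (assumed BIO tagging scheme, not the paper's code):
# group BIO-tagged tokens into temporal-expression spans.
def decode_bio(tokens, tags):
    """Return the list of contiguous TIMEX spans implied by BIO tags."""
    spans, current = [], []
    for tok, tag in zip(tokens, tags):
        if tag == "B-TIMEX":
            if current:                  # close any span in progress
                spans.append(" ".join(current))
            current = [tok]              # start a new span
        elif tag == "I-TIMEX" and current:
            current.append(tok)          # extend the open span
        else:
            if current:                  # an O tag closes the open span
                spans.append(" ".join(current))
            current = []
    if current:                          # flush a span ending at sentence end
        spans.append(" ".join(current))
    return spans

# Example on a French sentence ("Meeting on March 3, 2021."):
tokens = ["Reunion", "le", "3", "mars", "2021", "."]
tags   = ["O", "O", "B-TIMEX", "I-TIMEX", "I-TIMEX", "O"]
print(decode_bio(tokens, tags))  # ['3 mars 2021']
```

A zero-shot setup like the one described would train the tagger on English (and related-language) data and apply it, with decoding as above, to the target low-resource language.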