Multilingual Generative Language Models for Zero-Shot Cross-Lingual Event Argument Extraction

Anonymous

17 Sept 2021 (modified: 05 May 2023) · ACL ARR 2021 September Blind Submission
Abstract: We present a pioneering study on leveraging multilingual pre-trained generative language models for zero-shot cross-lingual event argument extraction (EAE) by formulating EAE as a language generation task. Compared to previous classification-based EAE models that build classifiers on top of pre-trained masked language models, our generative model effectively encodes event structures and better captures the dependencies between arguments. To achieve cross-lingual transfer, we design language-agnostic templates to encode argument roles and train our models on source languages to "generate" arguments in the source languages to fill in the language-agnostic template. The trained model can then be directly applied to target languages to "generate" arguments in the target languages to fill in the same template. Our experimental results demonstrate that the proposed model outperforms the current state of the art on zero-shot cross-lingual EAE. A comprehensive ablation study and error analysis are presented to better understand the advantages and the current limitations of using multilingual generative language models for cross-lingual transfer.
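To make the template-filling idea concrete, the following is a minimal sketch (not the paper's actual implementation; the event type, role names, and helper functions are illustrative). Argument roles appear as placeholder tokens that stay identical across languages, so a model trained to fill them with source-language text can fill them with target-language text at inference time; the predicted arguments are then recovered by aligning the generated string against the template.

```python
import re

# Illustrative language-agnostic template for an "Attack" event.
# The fixed words could themselves be placeholders or role markers in
# practice; only the <argN> slots matter for extraction.
TEMPLATE = "<arg1> attacked <arg2> using <arg3> at <arg4>"

def fill(template: str, args: dict) -> str:
    """Simulate the generator's output: each role slot is replaced
    by an argument string (in whatever language the input text uses)."""
    for role, text in args.items():
        template = template.replace(role, text)
    return template

def extract(template: str, generated: str) -> dict:
    """Recover role -> argument mappings by turning the template into a
    regex: fixed text is matched literally, slots become capture groups."""
    roles = re.findall(r"<arg\d+>", template)
    pattern = re.escape(template)
    for role in roles:
        pattern = pattern.replace(re.escape(role), "(.+?)")
    m = re.fullmatch(pattern, generated)
    return dict(zip(roles, m.groups())) if m else {}

args = {"<arg1>": "the rebels", "<arg2>": "the convoy",
        "<arg3>": "rockets", "<arg4>": "Kabul"}
filled = fill(TEMPLATE, args)
# extract(TEMPLATE, filled) recovers the original role-argument mapping
```

Because the template (and its slot tokens) never changes between training and inference, the same `extract` step works regardless of the language of the generated arguments, which is what enables the zero-shot transfer described above.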