Enhancing Cross-lingual Natural Language Inference by Soft Prompting with Language-independent Knowledge

Anonymous

16 Oct 2022 (modified: 05 May 2023), ACL ARR 2022 October Blind Submission
Keywords: prompt learning, cross-lingual natural language inference
Abstract: Cross-lingual natural language inference (XNLI) is a fundamental problem in cross-lingual language understanding. Many recent works have used prompt learning to address the lack of annotated parallel corpora in XNLI. However, these methods adopt discrete prompting by simply translating the template into the target language, and cannot transfer knowledge from high-resource to low-resource languages. In this paper, we propose a novel \textbf{Soft} prompt learning framework enhanced by \textbf{L}anguage-\textbf{In}dependent \textbf{K}nowledge (SoftLINK) for XNLI. SoftLINK leverages bilingual dictionaries to generate an augmented multilingual sample for the input texts. Our model then constructs cloze-style questions with soft prompts for both the original and augmented samples. SoftLINK also adopts a multilingual verbalizer to align the representations of the original and augmented multilingual questions in the semantic space with consistency regularization. Experimental results on XNLI demonstrate that SoftLINK achieves state-of-the-art performance and significantly outperforms previous methods under both the few-shot and full-shot cross-lingual transfer settings.
Paper Type: long
Research Area: Semantics: Sentence-level Semantics, Textual Inference and Other areas
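
The abstract describes two mechanisms that can be sketched concretely: building the augmented multilingual sample by swapping words through a bilingual dictionary (code-switching), and aligning the label distributions of the original and augmented cloze-style questions with a consistency term. The sketch below is illustrative only; the helper names (code_switch, consistency_loss), the dictionary format, and the symmetric-KL form of the regularizer are assumptions, not the authors' implementation.

    import random
    import torch.nn.functional as F

    def code_switch(tokens, bilingual_dict, ratio=0.3):
        # Replace a fraction of source-language tokens with translations drawn
        # from a bilingual dictionary (assumed to map a token to a list of
        # candidate translations) to build the augmented multilingual sample.
        return [
            random.choice(bilingual_dict[tok])
            if tok in bilingual_dict and random.random() < ratio
            else tok
            for tok in tokens
        ]

    def consistency_loss(logits_orig, logits_aug):
        # Symmetric KL divergence between the label distributions that the
        # multilingual verbalizer produces for the original and augmented
        # cloze-style questions.
        log_p = F.log_softmax(logits_orig, dim=-1)
        log_q = F.log_softmax(logits_aug, dim=-1)
        kl_pq = F.kl_div(log_q, log_p, reduction="batchmean", log_target=True)
        kl_qp = F.kl_div(log_p, log_q, reduction="batchmean", log_target=True)
        return 0.5 * (kl_pq + kl_qp)

In this sketch the consistency term penalizes disagreement between the model's predictions on the original and code-switched inputs; the paper's exact regularizer and verbalizer construction may differ.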