Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer

Anonymous

16 Jan 2022 (modified: 05 May 2023) · ACL ARR 2022 January Blind Submission
Abstract: Adapter modules have recently been used for efficient fine-tuning and language specialization of massively multilingual Transformers (MMTs), improving downstream zero-shot cross-lingual transfer. In this work, we propose orthogonal language and task adapters (dubbed orthoadapters) for cross-lingual transfer. They are trained to encode language- and task-specific information that is complementary (i.e., orthogonal) to the knowledge already stored in the pretrained MMT parameters. Our zero-shot transfer experiments, involving three tasks and 10 diverse languages, 1) point to the usefulness of orthoadapters in cross-lingual transfer, especially for the most complex task, NLI, but also 2) indicate that the optimal (ortho)adapter configuration depends strongly on the task and the target language at hand. We hope that our work will motivate a wider investigation of the usefulness of orthogonality constraints in language- and task-specific fine-tuning of pretrained Transformers.
Paper Type: short
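To illustrate the general idea of an orthogonality constraint on adapters, the sketch below shows a bottleneck adapter whose output is penalized for being parallel to the frozen transformer's hidden representation. This is a minimal, hypothetical sketch of one possible formulation, not the authors' exact method; all module names, dimensions, and the loss weight are assumptions.

```python
import torch
import torch.nn as nn

class OrthoAdapterSketch(nn.Module):
    """Bottleneck adapter (down-projection, nonlinearity, up-projection).
    Hypothetical sketch; hidden/bottleneck sizes are assumptions."""

    def __init__(self, hidden_dim: int = 768, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.GELU()

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # Adapter output; the residual connection is added by the caller.
        return self.up(self.act(self.down(hidden)))

def orthogonality_penalty(adapter_out: torch.Tensor,
                          hidden: torch.Tensor) -> torch.Tensor:
    # Penalize (squared) cosine similarity between the adapter output and
    # the frozen hidden state, encouraging the adapter to encode
    # information complementary to the pretrained representation.
    cos = nn.functional.cosine_similarity(adapter_out, hidden, dim=-1)
    return (cos ** 2).mean()

if __name__ == "__main__":
    # Toy usage: frozen MMT hidden states of shape (batch, seq_len, dim).
    hidden = torch.randn(2, 16, 768)
    adapter = OrthoAdapterSketch()
    out = adapter(hidden)
    task_loss = out.pow(2).mean()  # placeholder for a real task loss
    loss = task_loss + 0.1 * orthogonality_penalty(out, hidden)
    loss.backward()
```

In this sketch, the pretrained MMT weights stay frozen and only the adapter parameters receive gradients; the penalty weight (0.1 here) would in practice be tuned per task and language.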
