LangBridge: Multilingual Reasoning Without Multilingual Supervision

ICLR 2024 Workshop ME-FoMo Submission 45

Published: 04 Mar 2024 · Last Modified: 29 Apr 2024 · ME-FoMo 2024 Poster · CC BY 4.0
Keywords: Multilingualism, Reasoning, Language Models
TL;DR: LangBridge is a zero-shot approach that adapts language models to multilingual reasoning tasks without multilingual supervision.
Abstract: We introduce LangBridge, a zero-shot approach to adapt language models for multilingual reasoning tasks without multilingual supervision. LangBridge operates by bridging two models, each specialized in a different aspect: (1) one specialized in understanding multiple languages (e.g., the mT5 encoder) and (2) one specialized in reasoning (e.g., MetaMath). LangBridge connects the two models by introducing minimal trainable parameters between them. Despite being trained on English data only, LangBridge considerably enhances the performance of language models on low-resource languages across mathematical reasoning, coding, and logical reasoning. Our analysis suggests that the efficacy of LangBridge stems from the language-agnostic characteristics of multilingual representations. We publicly release our code and models.
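
The abstract describes the architecture only at a high level. As a rough illustration of the bridging idea, below is a minimal PyTorch sketch under several assumptions: a single trainable linear projection as the "minimal trainable parameters", soft-prompt-style conditioning of the reasoning model on projected encoder states, and the Hugging Face checkpoints google/mt5-xl and meta-math/MetaMath-7B-V1.0 as stand-ins for the two specialized models. This is not the authors' implementation; see the released code for the actual method.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoModelForCausalLM


class LangBridgeSketch(nn.Module):
    """Minimal sketch of the bridging idea: a frozen multilingual encoder
    feeds a frozen reasoning LM through a small trainable projection.
    Checkpoint names and the single linear layer are illustrative assumptions."""

    def __init__(self,
                 enc_name: str = "google/mt5-xl",               # multilingual encoder (assumed)
                 lm_name: str = "meta-math/MetaMath-7B-V1.0"):  # reasoning LM (assumed)
        super().__init__()
        # Use only the encoder half of mT5.
        self.encoder = AutoModel.from_pretrained(enc_name).encoder
        self.lm = AutoModelForCausalLM.from_pretrained(lm_name)
        # Freeze both pretrained models; only the projection below is trained.
        for p in self.encoder.parameters():
            p.requires_grad = False
        for p in self.lm.parameters():
            p.requires_grad = False
        # Minimal trainable parameters: map encoder states into the
        # reasoning LM's embedding space.
        self.proj = nn.Linear(self.encoder.config.d_model,
                              self.lm.config.hidden_size)

    def forward(self, enc_input_ids, enc_attention_mask, target_ids):
        # Encode the input (any language the encoder covers).
        enc_states = self.encoder(
            input_ids=enc_input_ids,
            attention_mask=enc_attention_mask).last_hidden_state
        # Projected encoder states act as soft-prompt embeddings for the LM.
        soft_prompt = self.proj(enc_states)
        tgt_embeds = self.lm.get_input_embeddings()(target_ids)
        inputs_embeds = torch.cat([soft_prompt, tgt_embeds], dim=1)
        attention_mask = torch.cat(
            [enc_attention_mask, torch.ones_like(target_ids)], dim=1)
        # Supervise only the (English) target tokens; -100 masks the prompt.
        labels = torch.cat(
            [torch.full_like(enc_input_ids, -100), target_ids], dim=1)
        return self.lm(inputs_embeds=inputs_embeds,
                       attention_mask=attention_mask,
                       labels=labels)
```

Because only the projection receives gradients, training on English-only data can still transfer to other languages, provided the encoder's representations are sufficiently language-agnostic, which is the hypothesis the paper's analysis supports.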
Submission Number: 45