Towards Efficient Federated Multilingual Modeling with LoRA-based Language Family Clustering

Anonymous

16 Dec 2023 · ACL ARR 2023 December Blind Submission
TL;DR: We introduce a communication-efficient federated learning framework with low-rank adaptation and language family clustering for Multilingual Modeling.
Abstract: Federated Multilingual Modeling (FMM) plays a crucial role in natural language processing applications, driven by the increasing diversity of languages and the growing demand for data privacy. However, FMM is limited by the substantial communication costs of networked training and by parameter interference between different languages. To address these challenges, we introduce a communication-efficient federated learning framework with low-rank adaptation and language family clustering for Multilingual Modeling (MM). In this framework, we freeze the weights of the base model and update only the lightweight low-rank adaptation (LoRA) parameters, minimizing communication costs. Additionally, we mitigate parameter conflicts by grouping languages according to their language family affiliations, rather than aggregating all LoRA parameters globally. Experiments demonstrate that our proposed model not only surpasses the baseline models in performance but also reduces communication overhead.
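The sketch below illustrates the aggregation idea described in the abstract: clients transmit only their LoRA tensors (the frozen base model is never communicated), and the server averages updates within each language-family cluster rather than across all clients. This is a minimal illustration assuming a FedAvg-style server; the names FAMILIES, aggregate_by_family, and the family assignments are hypothetical and not taken from the authors' code.

```python
from collections import defaultdict

import torch

# Hypothetical family clusters for a subset of the languages studied;
# the paper's actual clustering may differ.
FAMILIES = {
    "en": "germanic", "de": "germanic",
    "es": "romance", "fr": "romance", "pt": "romance",
    "ru": "slavic", "pl": "slavic", "cs": "slavic",
}


def aggregate_by_family(client_updates):
    """Average LoRA tensors within each language family.

    client_updates: list of (lang_code, {param_name: tensor}) pairs,
    where each dict holds only the LoRA A/B matrices. Returns
    {family: {param_name: averaged tensor}}.
    """
    # Group client updates by their language-family cluster.
    grouped = defaultdict(list)
    for lang, update in client_updates:
        grouped[FAMILIES[lang]].append(update)

    # Uniform average of each LoRA tensor inside a cluster; clients in
    # different families never mix, avoiding cross-family interference.
    aggregated = {}
    for family, updates in grouped.items():
        aggregated[family] = {
            name: torch.stack([u[name] for u in updates]).mean(dim=0)
            for name in updates[0]
        }
    return aggregated
```

In each round the server would broadcast a family's averaged LoRA weights only to the clients in that family, so the communicated payload stays at LoRA size and languages from different families do not overwrite one another's parameters.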
Paper Type: short
Research Area: NLP Applications
Contribution Types: NLP engineering experiment, Approaches to low-resource settings, Approaches to low-compute settings (efficiency)
Languages Studied: English, German, Spanish, French, Portuguese, Russian, Polish, Czech, Lithuanian, Chinese, Finnish, Arabic, Japanese