FedHLT: Efficient Federated Low-Rank Adaption with Hierarchical Language Tree for Multilingual Modeling
Abstract: Federated Multilingual Modeling (FMM) has become an essential approach in natural language processing (NLP) due to increasing linguistic diversity and a heightened emphasis on data privacy. However, FMM faces two primary challenges: 1) the high communication costs inherent in federated training, and 2) parameter interference, since languages exhibit both unique characteristics and shared features. To tackle these issues, we introduce a communication-efficient framework for Multilingual Modeling (MM) that combines low-rank adaptation (LoRA) with a hierarchical language tree structure. Our method keeps the base model's weights frozen and updates only the LoRA parameters, significantly reducing communication costs. Additionally, we mitigate parameter conflicts by aggregating languages according to their familial ties rather than merging all LoRA parameters together. Our experimental findings show that the proposed model surpasses established baselines in performance while markedly decreasing communication overhead.
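To make the family-wise aggregation idea concrete, the following is a minimal sketch (not the authors' implementation) of how LoRA updates could be averaged within language-family groups derived from a language tree, rather than merged into a single global adapter. The helper names (`average_lora`, `aggregate_by_family`, `family_of`) and the toy shapes are illustrative assumptions.

```python
import numpy as np

def average_lora(states):
    """Element-wise average of a list of LoRA state dicts."""
    keys = states[0].keys()
    return {k: np.mean([s[k] for s in states], axis=0) for k in keys}

def aggregate_by_family(client_lora, family_of):
    """Group client LoRA updates by language family, then average per group.

    client_lora: dict mapping client language -> LoRA state dict
                 (e.g. {"lora_A": ..., "lora_B": ...}); the frozen base
                 model weights are never communicated.
    family_of:   dict mapping language -> family node in the language tree.
    """
    groups = {}
    for lang, state in client_lora.items():
        groups.setdefault(family_of[lang], []).append(state)
    # One aggregated adapter per family, limiting cross-family interference.
    return {fam: average_lora(states) for fam, states in groups.items()}

# Toy usage: four languages from two families, rank-4 LoRA on a 16x16 layer.
rng = np.random.default_rng(0)
family_of = {"es": "Romance", "pt": "Romance", "de": "Germanic", "nl": "Germanic"}
client_lora = {
    lang: {"lora_A": rng.normal(size=(4, 16)), "lora_B": np.zeros((16, 4))}
    for lang in family_of
}
family_adapters = aggregate_by_family(client_lora, family_of)
print({fam: state["lora_A"].shape for fam, state in family_adapters.items()})
```

Communicating and averaging only the low-rank factors keeps per-round traffic proportional to the adapter size rather than the full model, and grouping by family keeps unrelated languages from overwriting each other's adapters.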