Averaging is Not Enough: Preserving Client-Specific Knowledge in Federated PEFT with One-Round Aggregation
Abstract: Federated Learning (FL) provides a privacy-preserving framework for fine-tuning Pre-trained Language Models (PLMs) on decentralized data. To reduce the computational and communication costs arising from the massive parameter counts of PLMs, parameter-efficient fine-tuning (PEFT) techniques have been widely adopted. However, integrating PEFT into FL remains challenging, especially under non-IID settings, where significant performance degradation is commonly observed. In this work, we identify the root cause of this degradation as a fundamental incompatibility between PEFT methods and the aggregation mechanism in FL. Specifically, conventional averaging fails to effectively preserve the personalized knowledge encoded in each client’s PEFT updates, resulting in suboptimal performance and slower convergence. To address this issue, we propose an expert-guided aggregation strategy designed to better retain client-specific information. We instantiate this strategy with FedELoRA, a novel LoRA-based framework for FL that requires only a single round of communication. FedELoRA treats each client’s locally trained LoRA adapter as an expert and employs a trainable gating network to dynamically combine them after local training. This enables effective integration of heterogeneous client knowledge while significantly reducing communication overhead. Extensive experiments across diverse domains demonstrate that FedELoRA consistently outperforms state-of-the-art baselines under both IID and non-IID settings, while using only 15.4% of the communication cost of the most efficient prior method. Our code is available at https://anonymous.4open.science/r/FedELoRA-30C0.
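The expert-guided aggregation described in the abstract can be pictured with a short sketch: the server collects each client's locally trained LoRA factors once, keeps them frozen as experts, and trains only a small gating network that mixes them per input. The PyTorch snippet below is a minimal illustration under that reading; the class, argument, and attribute names (ExpertGatedLoRALinear, client_adapters, lora_alpha) are assumptions for illustration, not the paper's actual implementation.

```python
# Illustrative sketch only (not the authors' code): gated mixture of
# frozen client LoRA experts on top of a frozen base linear layer.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ExpertGatedLoRALinear(nn.Module):
    """Frozen base linear layer whose output is corrected by a
    gated combination of client-specific LoRA adapters."""

    def __init__(self, base_linear: nn.Linear, client_adapters, lora_alpha: float = 16.0):
        super().__init__()
        self.base = base_linear
        for p in self.base.parameters():          # base PLM weights stay frozen
            p.requires_grad = False

        # Each client adapter is an (A, B) pair:
        #   A: (r, in_features), B: (out_features, r),
        # trained locally and uploaded to the server in a single round.
        self.experts_A = nn.ParameterList(
            [nn.Parameter(A.clone(), requires_grad=False) for A, _ in client_adapters]
        )
        self.experts_B = nn.ParameterList(
            [nn.Parameter(B.clone(), requires_grad=False) for _, B in client_adapters]
        )
        rank = client_adapters[0][0].shape[0]
        self.scaling = lora_alpha / rank

        # Trainable gate: maps a token representation to one weight per client expert.
        self.gate = nn.Linear(base_linear.in_features, len(client_adapters))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (..., in_features)
        gate_weights = F.softmax(self.gate(x), dim=-1)          # (..., n_experts)
        out = self.base(x)
        for i, (A, B) in enumerate(zip(self.experts_A, self.experts_B)):
            delta = F.linear(F.linear(x, A), B) * self.scaling  # B @ (A @ x)
            out = out + gate_weights[..., i:i + 1] * delta
        return out
```

Because the client adapters remain frozen and only the lightweight gate is trained after upload, clients send their adapters exactly once, consistent with the single communication round described in the abstract.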
Paper Type: Long
Research Area: Efficient/Low-Resource Methods for NLP
Research Area Keywords: parameter-efficient-training, LLM Efficiency
Contribution Types: Model analysis & interpretability, Approaches to low-resource settings, Approaches low compute settings-efficiency
Languages Studied: English
Submission Number: 1282