FedLoRA: When Personalized Federated Learning Meets Low-Rank Adaptation

Published: 24 Sept 2023 (modified: 11 Feb 2024), submitted to ICLR 2024
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Personalized Federated Learning, non-IID, Low-Rank Adaptation
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: We introduce FedLoRA, a novel approach to Personalized Federated Learning (PFL) inspired by recent advances in fine-tuning Large Language Models (LLMs), particularly the Low-Rank Adaptation (LoRA) technique. LoRA's success demonstrates that general linguistic knowledge is preserved in a pre-trained full-rank model, while domain-specific knowledge can be effectively retained in a low-rank parameter matrix. Building on this insight, FedLoRA maintains general knowledge shared among all clients in a common full-rank matrix, while capturing client-specific knowledge in a personalized low-rank matrix. Integrating LoRA into PFL, however, presents its own challenges. Unlike LoRA, which starts from pre-trained general knowledge, FedLoRA must train its full-rank matrix from scratch; this phase is susceptible to data heterogeneity, which can hinder the effective extraction of general knowledge. To address this challenge, we propose a new training strategy that mitigates the effects of data heterogeneity on the shared full-rank matrix. Experimental results across multiple datasets with varying degrees of data heterogeneity demonstrate that FedLoRA significantly outperforms current state-of-the-art methods.
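The core decomposition described in the abstract can be sketched in a few lines: each client holds a layer whose effective weight is a shared full-rank matrix plus a personal low-rank product, and the server aggregates only the shared part. This is a minimal illustrative sketch, not the paper's implementation; all class, function, and variable names (`LoRALinear`, `fedavg_shared`, `W`, `A`, `B`) are assumptions, and the aggregation shown is plain FedAvg rather than the paper's heterogeneity-mitigating training strategy.

```python
import numpy as np

rng = np.random.default_rng(0)

class LoRALinear:
    """Linear layer split into a shared full-rank matrix W and a
    personal low-rank update B @ A (names are illustrative)."""
    def __init__(self, d_in, d_out, rank):
        self.W = rng.normal(0, 0.02, (d_out, d_in))  # shared across clients
        self.A = rng.normal(0, 0.02, (rank, d_in))   # personal, low-rank
        self.B = np.zeros((d_out, rank))             # personal, zero-initialized

    def forward(self, x):
        # Effective weight is W + B @ A; only W is averaged by the server.
        return (self.W + self.B @ self.A) @ x

def fedavg_shared(layers):
    """Server step: average only the shared full-rank matrices W,
    leaving each client's low-rank factors (B, A) untouched."""
    W_avg = np.mean([layer.W for layer in layers], axis=0)
    for layer in layers:
        layer.W = W_avg.copy()

# Two simulated clients with personalized low-rank adapters.
clients = [LoRALinear(8, 4, rank=2) for _ in range(2)]
fedavg_shared(clients)
print(np.allclose(clients[0].W, clients[1].W))   # shared knowledge aligned
print(clients[0].forward(np.ones(8)).shape)      # per-client output
```

The design point the sketch captures is the parameter split: aggregation touches only `W`, so client-specific knowledge in `(B, A)` never leaves the device.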
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9045