MPLoRA: Orthogonal Multi-Path Low-Rank Adaptation for Parameter Efficient Fine-Tuning

Published: 10 Oct 2024, Last Modified: 02 Nov 2024, FITML 2024 Poster, CC BY 4.0
Keywords: parameter efficient fine-tuning, orthogonal constraints, low-rank adaptation
TL;DR: We enhance LoRA by incorporating multiple orthogonal low-rank paths, improving representation diversity and performance without increasing parameters.
Abstract: Parameter-efficient fine-tuning (PEFT) has become crucial for adapting large language models to specific tasks, with Low-Rank Adaptation (LoRA) emerging as a prominent method. However, capturing diverse representations within LoRA's limited parameter space remains challenging. We propose Multi-Path LoRA (MPLoRA), a novel approach that decomposes the adaptation matrix into multiple smaller matrices with orthogonal constraints. MPLoRA encourages diverse representations and improves adaptation capability without increasing parameter count. Experiments on various tasks demonstrate that MPLoRA outperforms LoRA and other baselines, with notable improvements on datasets with limited samples. Our analysis reveals that both the multi-path structure and orthogonal constraints contribute significantly to MPLoRA's effectiveness. These findings highlight MPLoRA's potential for enhancing LLM performance and generalization, especially in resource-constrained scenarios, offering new insights into parameter-efficient fine-tuning.
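To make the multi-path idea concrete, below is a minimal PyTorch sketch of how an adaptation matrix can be split into several low-rank paths whose down-projections are pushed toward orthogonality with a soft penalty. This is an illustrative reading of the abstract, not the authors' released implementation: the class name MPLoRALinear, the even per-path rank split, and the Gram-matrix penalty are assumptions for exposition.

```python
import torch
import torch.nn as nn


class MPLoRALinear(nn.Module):
    """Frozen linear layer plus a multi-path low-rank update (illustrative sketch)."""

    def __init__(self, base: nn.Linear, total_rank: int = 8, num_paths: int = 4,
                 alpha: float = 16.0):
        super().__init__()
        assert total_rank % num_paths == 0, "rank budget must split evenly across paths"
        self.base = base
        for p in self.base.parameters():  # keep the pretrained weights frozen
            p.requires_grad_(False)

        r = total_rank // num_paths  # per-path rank, so the parameter count matches single-path LoRA
        self.scaling = alpha / total_rank
        in_f, out_f = base.in_features, base.out_features

        # Each path i contributes B_i @ A_i, analogous to LoRA's B @ A.
        self.A = nn.ParameterList(
            [nn.Parameter(torch.randn(r, in_f) * 0.01) for _ in range(num_paths)])
        self.B = nn.ParameterList(
            [nn.Parameter(torch.zeros(out_f, r)) for _ in range(num_paths)])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.base(x)
        for A, B in zip(self.A, self.B):
            out = out + self.scaling * (x @ A.t() @ B.t())
        return out

    def orthogonality_penalty(self) -> torch.Tensor:
        """Soft constraint encouraging different paths to span orthogonal subspaces."""
        penalty = torch.zeros((), device=self.A[0].device)
        for i in range(len(self.A)):
            for j in range(i + 1, len(self.A)):
                # Frobenius norm of the cross-path Gram matrix A_i A_j^T;
                # it vanishes when the two down-projection subspaces are orthogonal.
                penalty = penalty + (self.A[i] @ self.A[j].t()).pow(2).sum()
        return penalty


# Hypothetical training objective: task loss plus a weighted sum of the
# per-layer orthogonality penalties (the weight lambda is an assumption here).
# loss = task_loss + lam * sum(m.orthogonality_penalty()
#                              for m in model.modules() if isinstance(m, MPLoRALinear))
```

In this reading, the multi-path structure adds representation diversity while the soft orthogonality term keeps the paths from collapsing onto the same subspace; the total rank (and hence the trainable parameter count) stays equal to standard LoRA.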
Submission Number: 47