Understanding LoRA Update Complexity Through Stable Rank
Keywords: Parameter-Efficient Fine-Tuning, Low-Rank Adaptation (LoRA), Stable Rank, Update Complexity, Optimization Dynamics, Large Language Models
Submission Number: 225