Spectral Adapter: Fine-Tuning in Spectral Space

Published: 25 Sept 2024, Last Modified: 06 Nov 2024 · NeurIPS 2024 poster · CC BY 4.0
Keywords: fine-tuning; PEFT; spectral decomposition; transfer learning in spectral space
TL;DR: In this work, we enhance current PEFT methods by incorporating spectral information of pretrained weights into the fine-tuning procedure.
Abstract: Recent developments in Parameter-Efficient Fine-Tuning (PEFT) methods for pretrained deep neural networks have captured widespread interest. In this work, we study the enhancement of current PEFT methods by incorporating the spectral information of pretrained weight matrices into the fine-tuning procedure. We investigate two spectral adaptation mechanisms, namely additive tuning and orthogonal rotation of the top singular vectors; both first carry out a Singular Value Decomposition (SVD) of the pretrained weights and then fine-tune the top spectral space. We provide a theoretical analysis of spectral fine-tuning and show that our approach improves the rank capacity of low-rank adapters under a fixed trainable-parameter budget. Through extensive experiments, we show that the proposed fine-tuning method achieves better parameter efficiency and tuning performance, and also benefits multi-adapter fusion. The source code will be open-sourced for reproducibility.
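To make the additive mechanism described above concrete, the following is a minimal PyTorch sketch, not the authors' released code: the class name `SpectralAdapter`, the rank argument `r`, and the zero-initialized `delta_U`/`delta_V` parameters are all illustrative assumptions. It performs the SVD once at initialization, freezes the spectrum, and trains only additive updates on the top-r singular vectors (the rotation variant would instead multiply those vectors by a learned orthogonal matrix).

```python
import torch
import torch.nn as nn


class SpectralAdapter(nn.Module):
    """Illustrative additive spectral adapter (assumed design, not the paper's code).

    Replaces a frozen linear weight W with
        W' = [U_r + dU, U_rest] diag(S) [V_r + dV, V_rest]^T,
    where only dU and dV (updates to the top-r singular vectors) are trainable.
    """

    def __init__(self, weight: torch.Tensor, r: int = 8):
        super().__init__()
        # One-time SVD of the frozen pretrained weight (shape: out x in)
        U, S, Vh = torch.linalg.svd(weight, full_matrices=False)
        self.register_buffer("U", U)    # frozen left singular vectors
        self.register_buffer("S", S)    # frozen singular values
        self.register_buffer("Vh", Vh)  # frozen right singular vectors (transposed)
        self.r = r
        # Zero-initialized trainable updates, so training starts from W exactly
        self.delta_U = nn.Parameter(torch.zeros(U.shape[0], r))
        self.delta_V = nn.Parameter(torch.zeros(Vh.shape[1], r))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Apply additive updates to the top-r singular vectors only
        U = torch.cat([self.U[:, :self.r] + self.delta_U,
                       self.U[:, self.r:]], dim=1)
        V = self.Vh.T
        V = torch.cat([V[:, :self.r] + self.delta_V,
                       V[:, self.r:]], dim=1)
        # Reassemble the weight and apply the linear map
        W = U @ torch.diag(self.S) @ V.T
        return x @ W.T


# Usage sketch: wrap an existing frozen layer's weight
layer = nn.Linear(64, 32)
adapter = SpectralAdapter(layer.weight.detach(), r=4)
y = adapter(torch.randn(2, 64))
```

For a frozen weight of size m x n, this trains (m + n) * r parameters while acting on the top-r directions of the pretrained spectrum, which is the sense in which the abstract's "top spectral space" is fine-tuned under a fixed parameter budget.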
Supplementary Material: zip
Primary Area: Deep learning architectures
Submission Number: 14643