QuIC Adapters: Quantum-Inspired Compound Adapters for Parameter-Efficient Fine-Tuning

04 May 2026 (modified: 09 May 2026) · ICML 2026 Workshop CoLoRAI Submission · CC BY 4.0
Keywords: peft, lora, quic, compound, quantum, orthogonal, adapters, finetuning, Llama, GLUE, VTAB
TL;DR: Quantum-inspired orthogonal fine-tuning via $k$-th compound matrices, yielding a tunable Pareto knob between parameter count and expressiveness.
Abstract: Low-rank adaptation (LoRA) is the dominant approach to parameter-efficient fine-tuning, but its additive update perturbs the multiplicative geometry of pretrained representations. Orthogonal fine-tuning (OFT, BOFT) preserves this geometry and achieves the best PEFT accuracy on language and vision benchmarks, but its parametrization scales quadratically with block size, forcing each adapter into a low-dimensional rotation subspace. We introduce QuIC (Quantum-Inspired Compound) adapters, an extension of the orthogonal fine-tuning family that uses exterior algebra to derive larger orthogonal blocks from a small base rotation. At compound order one, QuIC reduces to OFT and BOFT; at higher orders, a smaller base rotation fills the same block, so parameter cost decreases as the compound order rises. We evaluate QuIC across vision, language, and reasoning benchmarks, where it matches or outperforms LoRA, OFT, and BOFT at $4$--$30{\times}$ fewer parameters. The compound order traces an explicit accuracy--parameter Pareto frontier, making QuIC a strong default whenever the multiplicative geometry of pretrained representations matters.
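The abstract's core construction, deriving a larger orthogonal block from a small base rotation via the $k$-th compound matrix, can be sketched numerically. This is an illustrative sketch, not the paper's implementation: the entries of the $k$-th compound are the $k \times k$ minors of the base matrix indexed by lexicographically ordered $k$-subsets, and orthogonality is preserved by the Cauchy–Binet formula.

```python
from itertools import combinations

import numpy as np


def compound(A: np.ndarray, k: int) -> np.ndarray:
    """k-th compound of a square matrix: entries are k x k minors
    det(A[rows, cols]) over lexicographically ordered k-subsets."""
    n = A.shape[0]
    subsets = list(combinations(range(n), k))
    C = np.empty((len(subsets), len(subsets)))
    for i, rows in enumerate(subsets):
        for j, cols in enumerate(subsets):
            C[i, j] = np.linalg.det(A[np.ix_(rows, cols)])
    return C


# A 4x4 base rotation (orthogonal via QR) fills a 6x6 block at order k=2,
# since C(4, 2) = 6; only the 4x4 base is trainable.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
C2 = compound(Q, 2)
assert C2.shape == (6, 6)
assert np.allclose(C2 @ C2.T, np.eye(6))  # compound of orthogonal is orthogonal
```

This illustrates the parameter saving the abstract claims: a $4 \times 4$ rotation has 6 free parameters, while a directly parametrized $6 \times 6$ rotation would need 15, and the gap widens with block size and compound order.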
Submission Number: 27