Subspace-Boosted Model Merging

ICLR 2026 Conference Submission 12866 Authors

18 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: model merging
Abstract: Model merging combines multiple specialized expert models into a single model capable of performing multiple tasks. However, merging an increasing number of specialized experts generally yields diminishing returns and reduced overall performance gains. In this work, we offer an explanation and analysis from a task arithmetic perspective, revealing that as the merging process continues across more and more experts (and across numerous proposed merging heuristics), the associated task vector space experiences rank collapse. We intervene on this rank collapse with our newly introduced Subspace Boosting, which operates on the singular value decomposition of the task vector space and maintains task vector ranks, raising merging efficacy on up to 20 expert models by large margins of more than 10%. Moreover, we showcase how a Higher-Order Generalized Singular Value Decomposition can be leveraged to further quantify task similarity, offering a new interpretable perspective on model merging.
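
The abstract names the ingredients but not the algorithm, so the following is a minimal, hedged PyTorch sketch of the concepts involved: task vectors (task arithmetic), the effective rank of a merged task-vector matrix, and a hypothetical singular-value "boosting" step that keeps small directions alive. The function names (`task_vector`, `effective_rank`, `boost_subspace`) and the `floor` heuristic are illustrative assumptions, not the paper's actual Subspace Boosting rule.

```python
import torch

def task_vector(expert: dict, base: dict) -> dict:
    # Task arithmetic: a task vector is the expert's weight delta from the base.
    return {name: expert[name] - base[name] for name in base}

def effective_rank(W: torch.Tensor, tol: float = 1e-6) -> int:
    # Count singular values above a relative threshold of the largest one.
    s = torch.linalg.svdvals(W)
    return int((s > tol * s[0]).sum())

def boost_subspace(W: torch.Tensor, floor: float = 0.1) -> torch.Tensor:
    # Hypothetical intervention: lift small singular values to a floor
    # (a fraction of the largest one) so the merged task-vector matrix
    # retains a high effective rank. Illustrative only.
    U, s, Vh = torch.linalg.svd(W, full_matrices=False)
    s = s.clamp(min=(floor * s[0]).item())
    return U @ torch.diag(s) @ Vh

# Toy usage: 20 experts whose deltas are low-rank, so naive averaging
# collapses the merged task vector's rank; boosting restores it.
base = {"linear.weight": torch.randn(64, 64)}
experts = []
for _ in range(20):
    delta = 0.01 * torch.randn(64, 1) @ torch.randn(1, 64)  # rank-1 delta
    experts.append({"linear.weight": base["linear.weight"] + delta})

tvs = [task_vector(e, base) for e in experts]
merged = torch.stack([tv["linear.weight"] for tv in tvs]).mean(dim=0)
print("effective rank before boosting:", effective_rank(merged))
print("effective rank after boosting:", effective_rank(boost_subspace(merged)))
```

The toy deltas are deliberately rank-1 so the averaged task vector has rank at most 20 in a 64-dimensional layer, mimicking the rank collapse the abstract describes; any resemblance to the paper's exact setup is assumed, not confirmed.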
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 12866