MoPEFT: A Mixture-of-PEFTs for the Segment Anything Model

Published: 01 Jan 2025 · Last Modified: 06 Nov 2025 · CVPR Workshops 2025 · CC BY-SA 4.0
Abstract: The emergence of foundation models such as the Segment Anything Model (SAM) has sparked interest in Parameter-Efficient Fine-Tuning (PEFT) methods that tailor these large models to specific application domains. Different PEFT techniques modify a model's representations in different ways, making it a non-trivial task to select the most appropriate method for the domain of interest. We propose a new framework, Mixture-of-PEFTs (MoPEFT), inspired by traditional Mixture-of-Experts (MoE) methodologies, for fine-tuning SAM. Our MoPEFT framework incorporates three different PEFT techniques as submodules and dynamically learns to activate the ones best suited to a given data-task setup. We use MoPEFT to fine-tune the Segment Anything Model and show that it consistently outperforms other fine-tuning methods on the MESS benchmark.
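The abstract's core idea — several PEFT submodules combined by a learned gate, MoE-style — can be sketched in a few lines. The specific submodules, gating network, and shapes below are illustrative assumptions, not the paper's implementation: we assume LoRA, a bottleneck adapter, and a learned channel scaling as the three experts, mixed by a softmax over per-expert gating logits.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # hidden size (illustrative)

def lora_delta(x, A, B):
    # LoRA-style low-rank update: x @ A @ B
    return x @ A @ B

def adapter_delta(x, W_down, W_up):
    # Bottleneck adapter with a ReLU nonlinearity
    return np.maximum(x @ W_down, 0.0) @ W_up

def scale_delta(x, s):
    # Learned per-channel scaling (stand-in for a third PEFT expert)
    return x * s

def mopeft_layer(x, params):
    # Compute each expert's update, then mix with softmax gating weights
    # (in the actual framework the gate would be learned during fine-tuning).
    deltas = np.stack([
        lora_delta(x, *params["lora"]),
        adapter_delta(x, *params["adapter"]),
        scale_delta(x, params["scale"]),
    ])
    logits = params["gate"]
    w = np.exp(logits - logits.max())
    w /= w.sum()
    return x + np.tensordot(w, deltas, axes=1)

params = {
    "lora": (rng.standard_normal((d, 2)) * 0.1,
             rng.standard_normal((2, d)) * 0.1),
    "adapter": (rng.standard_normal((d, 4)) * 0.1,
                rng.standard_normal((4, d)) * 0.1),
    "scale": rng.standard_normal(d) * 0.1,
    "gate": np.zeros(3),  # uniform mixing before training
}
x = rng.standard_normal((1, d))
y = mopeft_layer(x, params)
print(y.shape)  # (1, 8)
```

With zero gating logits the layer averages the three experts' updates; fine-tuning would push the logits toward whichever submodule helps most on the target data-task setup.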