BA-SAM: Boundary-Aware Adaptation of Segment Anything Model for Medical Image Segmentation

Published: 01 Jan 2024, Last Modified: 03 Mar 2025 · BIBM 2024 · CC BY-SA 4.0
Abstract: The Segment Anything Model (SAM) has demonstrated remarkable performance on natural images. However, it faces considerable challenges when applied to medical datasets: vanilla SAM degrades and generalises poorly on medical images with large domain gaps. Worse still, many medical segmentation tasks demand accurate boundary identification, a need that existing SAM variants struggle to meet. To overcome these challenges, we propose BA-SAM, a Segment Anything Model variant that achieves better performance on medical images. Specifically, following the idea of parameter-efficient fine-tuning (PEFT), we first add a parallel tuneable CNN encoder to better extract local details through convolutional operations, while most of the original ViT encoder in SAM is kept frozen. Moreover, we introduce a Boundary-Aware Attention (BAA) module in the CNN branch to encourage the framework to better capture boundary-related features. Extensive experiments on three public datasets demonstrate that the proposed BA-SAM achieves further improvements over existing state-of-the-art methods.
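The PEFT scheme described above can be sketched as a simple parameter split: freeze the SAM ViT encoder, train only the parallel CNN branch (including its BAA module) and the decoder. The module names below are purely illustrative assumptions, not identifiers from the BA-SAM codebase.

```python
# Illustrative sketch of the parameter-efficient fine-tuning split
# described in the abstract. Module names (vit_encoder, cnn_branch,
# baa, mask_decoder) are hypothetical, not BA-SAM's actual code.

def select_trainable(param_names):
    """Return the parameter names that should receive gradients:
    everything except the frozen SAM ViT encoder weights."""
    trainable = []
    for name in param_names:
        if name.startswith("vit_encoder."):
            continue  # frozen pretrained SAM ViT weights
        trainable.append(name)  # tuneable CNN branch, BAA module, decoder
    return trainable

params = [
    "vit_encoder.block0.attn.qkv",
    "vit_encoder.block0.mlp.fc1",
    "cnn_branch.conv1.weight",
    "cnn_branch.baa.attention.weight",
    "mask_decoder.head.weight",
]
print(select_trainable(params))
```

In a framework such as PyTorch this split would typically be applied by setting `requires_grad = False` on the frozen parameters before building the optimiser.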