UnCLe SAM: Unleashing SAM’s Potential for Continual Prostate MRI Segmentation

Published: 06 Jun 2024, Last Modified: 06 Jun 2024
Venue: MIDL 2024 Poster
License: CC BY 4.0
Keywords: Continual learning, Foundation Model, Segment Anything Model
Abstract: Continual medical image segmentation has primarily relied on U-Net and its derivatives, which struggle to meet the demands of domains that shift over time. Foundation models serve as robust knowledge repositories, offering advantages such as general applicability, knowledge transferability, and continuous improvement. By leveraging their pre-existing domain knowledge, adaptability, generalization, and performance across diverse tasks can be improved. In this work, we show how to deploy the Segment Anything Model's (SAM) natural-image pretraining for continual medical image segmentation, where data is sparse. We introduce UnCLe SAM, a novel approach that uses the knowledge of the pre-trained SAM foundation model to make it suitable for continual segmentation in dynamic environments. We demonstrate that UnCLe SAM is a robust alternative to U-Net-based approaches and showcase its state-of-the-art (SOTA) continual medical segmentation capabilities. The primary objective of UnCLe SAM is to strike a delicate balance between model rigidity and plasticity, effectively addressing prevalent pitfalls of continual learning (CL) methods. We assess UnCLe SAM on a series of prostate segmentation tasks, applying a set of different CL methods. Comparative evaluations against the SOTA Lifelong nnU-Net framework reveal the potential of UnCLe SAM for dynamically changing environments such as healthcare. Our code base will be made public upon acceptance.
Submission Number: 28