Does Radiomic Segmentation Complexity Influence Foundation Model Performance? A Case Study with SAM-Med3D

11 Apr 2025 (modified: 12 Apr 2025) · MIDL 2025 Short Papers Submission · CC BY 4.0
Keywords: Medical Image Segmentation, Foundation Model, Deep Learning, Machine Learning, Radiomic Features, 3D Medical Imaging, Artificial Intelligence.
TL;DR: Segmentation complexity, quantified through radiomic features, significantly influences the performance of SAM-Med3D, enabling predictive insights into foundation model behavior across diverse 3D medical imaging tasks.
Abstract: The Segment Anything Model (SAM) has significantly expanded the application of foundation models in medical image segmentation. However, its performance can vary considerably with the complexity of the segmentation task. This study examines how segmentation complexity, characterized through radiomic features, affects the performance of SAM-Med3D on 3D medical imaging tasks. Specifically, it explores the relationship between segmentation complexity and model performance using five public datasets: MSD-Vessel, MSD-Colon, EPISURG, SPIDER, and PENGWIN. Performance was evaluated with Intersection over Union (IoU) and Mean Surface Distance (MSD). Our results reveal that radiomic features such as mesh volume, sphericity, surface/volume ratio, and the texture difference inside and outside the ROI correlate significantly with segmentation performance. Higher mesh volumes and lower surface/volume ratios were associated with better performance, suggesting that larger, more compact structures are segmented more accurately. These findings underscore the relevance of assessing the influence of segmentation complexity, as captured through radiomic features, in medical imaging. This analysis provides valuable insights into the applicability of generalist models to specific tasks, based on the radiomic characteristics of the data.
Submission Number: 48
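As a rough illustration of the per-case evaluation described in the abstract, the sketch below computes IoU and a symmetric mean surface distance between a predicted and a reference 3D mask using NumPy/SciPy, then correlates a radiomic feature with per-case performance via Spearman's rank correlation. The radiomic shape features themselves (mesh volume, sphericity, surface/volume ratio) could be extracted with a library such as PyRadiomics. All function names and variables here are illustrative assumptions, not the authors' actual code.

```python
import numpy as np
from scipy import ndimage
from scipy.stats import spearmanr

def iou(pred, gt):
    """Intersection over Union between two binary 3D masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    union = np.logical_or(pred, gt).sum()
    return np.logical_and(pred, gt).sum() / union if union else 1.0

def _surface(mask):
    """Surface voxels: the mask minus its binary erosion."""
    return mask & ~ndimage.binary_erosion(mask)

def mean_surface_distance(pred, gt, spacing=(1.0, 1.0, 1.0)):
    """Symmetric mean surface distance (in physical units) between two binary masks."""
    pred_s, gt_s = _surface(pred.astype(bool)), _surface(gt.astype(bool))
    # Distance from every voxel to the nearest surface voxel of the other mask
    dt_gt = ndimage.distance_transform_edt(~gt_s, sampling=spacing)
    dt_pred = ndimage.distance_transform_edt(~pred_s, sampling=spacing)
    return np.concatenate([dt_gt[pred_s], dt_pred[gt_s]]).mean()

# Hypothetical usage: `features` holds a per-case radiomic value (e.g. PyRadiomics
# 'original_shape_MeshVolume'), `ious` the per-case IoU of SAM-Med3D predictions.
# rho, p = spearmanr(features, ious)
```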