Semi-Supervised Learning Based Cascaded Pocket U-Net for Organ and Pan-cancer Segmentation in Abdomen CT

10 Sept 2023 (modified: 22 Dec 2023) · Submitted to FLARE 2023
Keywords: Organ and pan-cancer segmentation, Semi-supervised learning, Minimal parameter size
Abstract: In clinical practice, CT scans are frequently employed as the primary imaging modality for detecting prevalent tumors arising from the abdominal organs. Hence, accomplishing simultaneous organ segmentation and pan-cancer segmentation in abdominal CT scans is of significant importance for decreasing the workload of clinical practitioners. To maximize the utilization of partially labeled and unlabeled data, this work employs an iterative training strategy through a semi-supervised approach based on pseudo labels. Furthermore, to reduce the model's parameter size and improve the efficiency of GPU utilization, the proposed method is built upon the pocket U-Net architecture. The methodology involves a cascaded network consisting of two parts: first, a low-resolution pocket U-Net trained on labeled data performs coarse segmentation to localize the target region and reduce image dimensions; then, a high-resolution pocket U-Net performs fine segmentation to precisely delineate organ and tumor regions. As demonstrated by the evaluation outcomes on the FLARE 2023 validation dataset, the proposed method achieves an average dice similarity coefficient (DSC) of 88.94\% for organs and 15.92\% for tumors, along with normalized surface dice (NSD) values of 93.31\% for organs and 0.0816\% for tumors, with minimal parameter size. Furthermore, the average inference time is 82.61 seconds, with an average maximum GPU memory usage of 3560 MB. Codes are available at https://github.com/wt812549723/FLARE2023_solution.
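The two-stage cascade described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the helper names (`coarse_roi`, `cascaded_segment`), the downsampling factor, and the ROI margin are all assumptions, and the actual networks would be trained pocket U-Nets rather than the callables passed in here.

```python
import numpy as np

def coarse_roi(mask, margin=2):
    # Hypothetical helper: bounding box of the coarse-stage foreground,
    # padded by a small margin so the fine stage sees full context.
    coords = np.argwhere(mask > 0)
    lo = np.maximum(coords.min(axis=0) - margin, 0)
    hi = np.minimum(coords.max(axis=0) + margin + 1, mask.shape)
    return tuple(slice(int(l), int(h)) for l, h in zip(lo, hi))

def cascaded_segment(volume, coarse_net, fine_net, scale=4):
    # Stage 1: run the low-resolution network on a downsampled volume
    # to localize the organ/tumor region cheaply.
    low = volume[::scale, ::scale, ::scale]
    coarse_mask = coarse_net(low)
    # Nearest-neighbor upsample of the coarse mask back to full size.
    up = coarse_mask
    for axis in range(3):
        up = np.repeat(up, scale, axis=axis)
    up = up[: volume.shape[0], : volume.shape[1], : volume.shape[2]]
    roi = coarse_roi(up)
    # Stage 2: run the high-resolution network only on the cropped ROI,
    # then paste the fine prediction back into a full-size label volume.
    fine = fine_net(volume[roi])
    out = np.zeros(volume.shape, dtype=fine.dtype)
    out[roi] = fine
    return out
```

The design choice the abstract hinges on is that stage 1 never needs full resolution: it only has to shrink the input that the expensive fine-grained stage 2 must process, which is what keeps both parameter size and GPU memory low.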
Supplementary Material: zip
Submission Number: 19