Multi-task Learning with Iterative Training in Hybrid Labeling Dataset for Semi-supervised Abdominal Multi-organ and Tumor Segmentation

13 Sept 2023 (modified: 22 Dec 2023) · Submitted to FLARE 2023
Keywords: Segmentation, Multi-task learning, Semi-supervised learning
TL;DR: We propose a semi-supervised multi-task learning approach with iterative training on a hybrid-labeled dataset for abdominal organ and tumor segmentation in CT images.
Abstract: Simultaneous segmentation of organs and tumors from abdominal CT images is challenging, yet it has many critical clinical applications such as disease diagnosis, lesion and organ measurement, and surgical planning. Based on nnU-Net, we develop a method for abdominal organ and pan-tumor segmentation that handles both abdominal and whole-body CT images. First, in a fully supervised setting, we train base organ and tumor models to generate initial pseudo-labels. Then, in a semi-supervised setting, a hybrid-labeled dataset is used to iteratively train higher-performance segmentation models that produce higher-quality pseudo-labels. Because organs and tumors in the abdominal region are correlated, we leverage multi-task learning to train a single model that segments both organs and tumors, which improves the performance of each individual task. Finally, to balance segmentation efficiency and accuracy, we design a sliding-window strategy based on a body prior together with a simplified version of test-time augmentation (TTA4). Our final model achieved 88.93\% mean organ DSC and 45.76\% tumor DSC on the FLARE23 online validation set, with an average running time of 26.7 s and an area under the GPU memory-time curve of 49352.9 MB. On the test set, we achieved mean organ and tumor DSCs of 89.68\% and 62.89\%, NSDs of 95.89\% and 51.69\%, and an average inference time of 18.53 s. Our code is publicly available at https://github.com/LeoZhong997/FLARE23.
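
The core of the pipeline described in the abstract is an iterative pseudo-labeling loop: fully supervised base organ and tumor models provide initial pseudo-labels, and successive multi-task models are retrained on the hybrid-labeled dataset to refine them. The sketch below only illustrates that control flow; the dataclass and the `train_fn`, `predict_fn`, and `merge_fn` callables are hypothetical stand-ins, not the authors' actual API, and the real models are trained with the nnU-Net framework.

```python
# Minimal sketch of the iterative semi-supervised multi-task training loop.
# All helpers passed in (train_fn, predict_fn, merge_fn) are hypothetical placeholders.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class HybridDataset:
    """Images paired with either ground-truth partial labels or pseudo-labels."""
    images: List[str]
    labels: Dict[str, str] = field(default_factory=dict)  # case id -> label path

def iterative_multitask_training(
    labeled: HybridDataset,                  # partially labeled training cases
    unlabeled_images: List[str],             # cases without annotations
    train_fn: Callable[[HybridDataset, str], object],          # (dataset, task) -> model
    predict_fn: Callable[[object, List[str]], Dict[str, str]], # model -> pseudo-label paths
    merge_fn: Callable[[Dict[str, str], Dict[str, str]], Dict[str, str]],
    n_rounds: int = 2,
):
    # Step 1: fully supervised base models for organs and tumors.
    organ_model = train_fn(labeled, "organ")
    tumor_model = train_fn(labeled, "tumor")

    # Initial pseudo-labels: merge per-case organ and tumor label maps.
    pseudo = merge_fn(predict_fn(organ_model, unlabeled_images),
                      predict_fn(tumor_model, unlabeled_images))

    model = None
    for _ in range(n_rounds):
        # Step 2: hybrid-labeled dataset = ground-truth labels + current pseudo-labels.
        hybrid = HybridDataset(images=labeled.images + unlabeled_images,
                               labels={**labeled.labels, **pseudo})
        # Step 3: a single multi-task model segments organs and tumors jointly.
        model = train_fn(hybrid, "organ+tumor")
        # Step 4: regenerate higher-quality pseudo-labels for the next round.
        pseudo = predict_fn(model, unlabeled_images)
    return model
```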
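
On the inference side, the abstract mentions a sliding-window strategy based on a body prior and a simplified test-time augmentation called TTA4. The sketch below assumes one plausible reading: restrict prediction to a body bounding box obtained by HU thresholding and average logits over four flip variants. Both this interpretation of TTA4 and the `predict_logits` helper are assumptions, not the authors' implementation.

```python
# Sketch of body-prior cropping plus a simplified 4-variant TTA (assumed to mean
# averaging over the identity and three single-axis flips).
import numpy as np

def body_bounding_box(ct: np.ndarray, hu_threshold: float = -500.0):
    """Bounding box of the body region from a simple HU threshold (assumed prior)."""
    coords = np.argwhere(ct > hu_threshold)   # assumes the threshold captures the body
    lo, hi = coords.min(axis=0), coords.max(axis=0) + 1
    return tuple(slice(l, h) for l, h in zip(lo, hi))

def tta4_predict(ct: np.ndarray, predict_logits) -> np.ndarray:
    """Average logits over four flip variants inside the body bounding box."""
    box = body_bounding_box(ct)
    crop = ct[box]
    flips = [(), (0,), (1,), (2,)]            # identity + single-axis flips: 4 variants
    acc = None
    for axes in flips:
        inp = np.flip(crop, axes) if axes else crop
        logits = predict_logits(inp)          # expected shape: (num_classes, *crop.shape)
        if axes:
            # Undo the flip on the spatial axes (skip the leading class axis).
            logits = np.flip(logits, tuple(a + 1 for a in axes))
        acc = logits if acc is None else acc + logits
    out = np.zeros(ct.shape, dtype=np.int16)
    out[box] = np.argmax(acc / len(flips), axis=0)
    return out
```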
Supplementary Material: zip
Submission Number: 29