A Noisy nnU-Net Student for Semi-supervised Abdominal Organ Segmentation

23 Jul 2022 (modified: 05 May 2023), MICCAI 2022 Challenge FLARE Submission
Keywords: Organ Segmentation, Self-training, Noisy student
TL;DR: A simple extension of the nnU-Net framework that applies noisy student training in the context of the 2022 MICCAI FLARE challenge.
Abstract: While deep learning methods have shown great potential for medical image segmentation, collecting the expert-annotated data required to train large neural networks effectively remains both time-consuming and expensive. At the same time, large amounts of unlabeled medical image data are available due to the rapid growth of digital healthcare and the increasing availability of imaging devices. This creates great potential for methods that can exploit large unlabeled image datasets to improve sample efficiency on downstream tasks with limited labeled data. Deploying such models in real-world scenarios, however, imposes constraints on model size and the compute resources required during inference. The 2022 MICCAI FLARE Challenge addresses both of these aspects in a task where participants can make use of 2000 unlabeled and 50 labeled images, while inference speed, CPU utilization, and GPU memory consumption are also measured as part of the evaluation metrics. In the context of this challenge, we propose a simple method to make use of unlabeled data: the noisy nnU-Net student. The unlabeled data is exploited through self-training, where a teacher model creates pseudo-labels, which in turn are used to train a student model of the same architecture. We show, based on cross-validation results and a separate held-out dataset, that this simple method yields improvements even over a strong baseline (+2 DSC), while simultaneously reducing inference time by an order of magnitude, from an average of over 500s to roughly 50s, and peak memory requirements by almost a factor of two.
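For illustration, the sketch below shows the teacher-student self-training loop described in the abstract in generic PyTorch. The model definitions, toy tensors, class count, and helper names are hypothetical placeholders, not the authors' nnU-Net code; the "noise" of noisy student training would come from the data augmentation applied during student training, which is omitted here.

```python
# Minimal sketch of noisy-student self-training, assuming generic PyTorch
# components. SegNet stand-ins, toy data, and batch sizes are hypothetical.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, ConcatDataset, TensorDataset

def train(model, loader, epochs=1, lr=1e-3):
    """Standard supervised training with voxel-wise cross-entropy."""
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for images, labels in loader:
            opt.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            opt.step()
    return model

@torch.no_grad()
def pseudo_label(model, unlabeled_loader):
    """Teacher predicts hard pseudo-labels for every unlabeled image."""
    model.eval()
    images, labels = [], []
    for (batch,) in unlabeled_loader:
        preds = model(batch).argmax(dim=1)  # voxel-wise class predictions
        images.append(batch)
        labels.append(preds)
    return TensorDataset(torch.cat(images), torch.cat(labels))

# Toy data standing in for the labeled / unlabeled CT volumes.
labeled_ds = TensorDataset(torch.randn(8, 1, 32, 32, 32),
                           torch.randint(0, 14, (8, 32, 32, 32)))
unlabeled_ds = TensorDataset(torch.randn(16, 1, 32, 32, 32))

teacher = nn.Sequential(nn.Conv3d(1, 16, 3, padding=1), nn.ReLU(),
                        nn.Conv3d(16, 14, 1))  # stand-in for the segmentation net
student = nn.Sequential(nn.Conv3d(1, 16, 3, padding=1), nn.ReLU(),
                        nn.Conv3d(16, 14, 1))  # same architecture as the teacher

# 1. Train the teacher on the labeled data only.
train(teacher, DataLoader(labeled_ds, batch_size=2))

# 2. Teacher generates pseudo-labels for the unlabeled data.
pseudo_ds = pseudo_label(teacher, DataLoader(unlabeled_ds, batch_size=2))

# 3. Train the student on labeled + pseudo-labeled data (augmentation omitted).
train(student, DataLoader(ConcatDataset([labeled_ds, pseudo_ds]), batch_size=2))
```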