Abstract: Semi-supervised learning has made significant progress in medical image segmentation, aiming to improve model performance with small amounts of labeled data and large amounts of unlabeled data. However, most existing methods focus heavily on supervision in the label space while providing insufficient supervision in the feature space. Moreover, these methods generally concentrate on enhancing inter-class discrimination and neglect intra-class variation, which significantly affects fine-grained segmentation in complex medical images. To overcome these limitations, we propose a novel semi-supervised segmentation approach, Prototype-Augmented Mean Teacher (PAMT). Built upon the Mean Teacher framework, PAMT incorporates non-learnable prototypes to strengthen feature-space supervision. Specifically, we introduce two innovative loss functions: a Prototype-Guided Pixel Classification (PGPC) Loss and an Adaptive Prototype Contrastive (APC) Loss. The PGPC Loss enforces consistency between pixel classifications and their nearest prototypes through a nearest-neighbor strategy, while the APC Loss further captures intra-class variability, thereby enhancing the model's ability to capture differences among pixels of the same class. By augmenting the Mean Teacher framework with prototype learning, PAMT not only improves feature representation and mitigates pseudo-label noise but also boosts segmentation accuracy and generalization, particularly for complex anatomical structures. Extensive experiments on three public datasets demonstrate that PAMT consistently surpasses state-of-the-art methods.
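The abstract describes the PGPC Loss only at a high level; the following is a minimal sketch of one plausible reading of the nearest-neighbor prototype strategy, not the paper's actual implementation. The function name, tensor shapes, temperature parameter, and the use of cosine similarity with a cross-entropy consistency term are all assumptions for illustration.

```python
import torch
import torch.nn.functional as F


def prototype_guided_pixel_loss(features, logits, prototypes, temperature=0.1):
    """Sketch of a prototype-guided pixel classification consistency loss (assumed form).

    features:   (B, C, H, W) pixel embeddings from the segmentation backbone
    logits:     (B, K, H, W) class scores from the segmentation head
    prototypes: (K, C) non-learnable class prototypes (e.g. running means of
                labeled-pixel features), one per class
    """
    B, C, H, W = features.shape
    # L2-normalize so that dot products correspond to cosine similarity.
    feats = F.normalize(features, dim=1).permute(0, 2, 3, 1).reshape(-1, C)  # (B*H*W, C)
    protos = F.normalize(prototypes, dim=1)                                  # (K, C)

    # Similarity of every pixel to every prototype; the nearest prototype
    # provides a pseudo target for that pixel.
    sim = feats @ protos.t() / temperature  # (B*H*W, K)
    nearest = sim.argmax(dim=1)             # (B*H*W,)

    # Encourage the segmentation head to agree with the nearest-prototype assignment.
    pred = logits.permute(0, 2, 3, 1).reshape(-1, logits.shape[1])  # (B*H*W, K)
    return F.cross_entropy(pred, nearest)
```

In this reading, the prototypes act as fixed anchors in feature space, so the loss supervises features through their agreement with the classification head rather than through learnable classifier weights.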