MetaStain: Stain-generalizable Meta-learning for Cell Segmentation and Classification with Limited Exemplars
Abstract: Deep learning models excel when evaluated on test data that
share similar attributes and/or distribution with the training data. However, their ability to generalize may suffer when the training and testing distributions diverge, i.e., under domain shift. In this work, we leverage meta-learning to introduce MetaStain, a stain-generalizable representation learning framework for cell
segmentation and classification in histopathology images. Owing to its episodic meta-learning paradigm, MetaStain can adapt to unseen stains and/or novel classes through fine-tuning, even with limited annotated samples. We design a stain-aware triplet loss that clusters stain-
agnostic class-specific features and separates intra-stain features extracted from different classes. We also employ a consistency triplet loss to preserve the spatial correspondence between tissues under different stains. During test-time adaptation, a refined class weight generator
module is optionally introduced if the unseen testing data also contains novel classes. MetaStain significantly outperforms state-of-the-art segmentation and classification methods on the multi-stain MIST dataset under various experimental settings.
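The stain-aware triplet objective described above can be illustrated with a minimal sketch. This is an assumption about its general form (a standard margin-based hinge triplet loss), not the paper's exact formulation: the positive is a same-class embedding drawn from a different stain, and the negative is a same-stain embedding from a different class, so minimizing the loss clusters stain-agnostic class features while separating intra-stain features across classes.

```python
import numpy as np

def stain_aware_triplet_loss(anchor, positive, negative, margin=1.0):
    """Hinge triplet loss over embedding vectors.

    anchor:   embedding of a cell from stain S, class C
    positive: embedding of a cell from a *different* stain, same class C
              (pulled toward the anchor -> stain-agnostic class clusters)
    negative: embedding of a cell from the *same* stain S, different class
              (pushed away -> intra-stain class separation)
    The specific sampling scheme here is illustrative, not MetaStain's exact one.
    """
    d_pos = np.linalg.norm(anchor - positive)  # same-class, cross-stain distance
    d_neg = np.linalg.norm(anchor - negative)  # cross-class, same-stain distance
    return max(d_pos - d_neg + margin, 0.0)

# Example: the negative is already far relative to the positive -> zero loss.
a = np.zeros(2)
print(stain_aware_triplet_loss(a, np.array([0.0, 1.0]), np.array([3.0, 0.0])))
```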