Keywords: Self-supervised representation learning, Computational Pathology
TL;DR: We introduce LEMON (Learning Embeddings from Morphology Of Nuclei), a self-supervised foundation model for scalable single-cell image representation.
Abstract: Representation learning is a central challenge in Computational Pathology (CP), with direct implications for cancer research and precision medicine. While Self-Supervised Learning (SSL) has advanced patch- and slide-level analysis of Whole-Slide Images (WSIs), single-cell representation learning has remained underexplored, despite its importance for characterizing cell types and phenotypes. We introduce LEMON (Learning Embeddings from Morphology Of Nuclei), a self-supervised foundation model for scalable single-cell image representation. Trained on millions of cell images spanning diverse tissues and cancer types, LEMON provides versatile and robust morphology representations that enable large-scale single-cell studies in pathology. We demonstrate its effectiveness across diverse prediction tasks on five benchmark datasets, establishing LEMON as a new paradigm for cell-level computational pathology.
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 18058