Embedding cell state dynamics via contrastive learning of representations of 3D dynamic imaging datasets
Submission Track: Short papers presenting ongoing research or work submitted to other venues (up to 5 pages, excluding references)
Keywords: dynamic imaging; cell profiling; contrastive learning; nD imaging; single-cell phenotyping
TL;DR: DynaCLR integrates single-cell tracking and time-aware contrastive sampling to learn robust, temporally regularized representations of morphological dynamics.
Abstract: Robust and scalable profiling of cell state dynamics from large-scale 3D live-cell imaging data is an open challenge. We propose a self-supervised method for embedding cell state dynamics via contrastive learning of representations (DynaCLR) to address this need. DynaCLR integrates single-cell tracking and time-aware contrastive sampling to learn robust, temporally regularized representations of morphological dynamics. This pretext task leads to an embedding space in which distances encode transitions in cell state dynamics. DynaCLR embeddings generalize to out-of-distribution imaging experiments and can be used for multiple downstream tasks with sparse human annotations. They enable robust classification of cell infection and division, and clustering of heterogeneous cell migration behaviors. DynaCLR is a generalist method for comparative analyses of dynamic cellular responses to pharmacological, microbial, and genetic perturbations. We provide a PyTorch-based implementation of the method and a model library trained with 3D and 2D time-lapse datasets.
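The core idea of time-aware contrastive sampling can be illustrated with a minimal sketch, assuming a triplet scheme in which the anchor and positive are the same tracked cell at adjacent timepoints and negatives are other cells at the anchor's timepoint; function names, data shapes, and the InfoNCE formulation here are illustrative assumptions, not the authors' implementation:

```python
# Illustrative sketch (NOT the DynaCLR implementation): time-aware
# contrastive sampling over tracked single cells, scored with an
# InfoNCE-style loss. All names and shapes are assumptions.
import numpy as np

def sample_time_aware_triplet(tracks, track_id, t, rng):
    """tracks: dict mapping track_id -> (T, D) array of per-timepoint features."""
    anchor = tracks[track_id][t]
    positive = tracks[track_id][t + 1]          # same cell, next frame
    other_ids = [k for k in tracks if k != track_id]
    neg_id = other_ids[rng.integers(len(other_ids))]
    negative = tracks[neg_id][t]                # different cell, same frame
    return anchor, positive, negative

def info_nce(anchor, positive, negatives, temperature=0.1):
    """InfoNCE loss for one anchor, one positive, and K negatives."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, n) for n in negatives]) / temperature
    logits -= logits.max()                      # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])                    # positive sits at index 0
```

Minimizing this loss pulls temporally adjacent views of the same cell together while pushing apart different cells, which is one way an embedding space whose distances reflect cell state transitions could arise.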
Submission Number: 65