Know Thyself by Knowing Others: Learning Neuron Identity from Population Context

Published: 18 Sept 2025, Last Modified: 29 Oct 2025 · NeurIPS 2025 poster · CC BY-NC 4.0
Keywords: neural identity, cell type identification, brain region identification, computational neuroscience, systems neuroscience, self-supervised learning, contrastive learning
TL;DR: We present NuCLR, a self-supervised framework that learns high-quality, population-aware neuron-level embeddings directly from spike train data using a spatio-temporal transformer and tailored contrastive loss.
Abstract: Identifying the functional identity of individual neurons is essential for interpreting circuit dynamics, yet it remains a major challenge in large-scale _in vivo_ recordings, where anatomical and molecular labels are often unavailable. Here we introduce NuCLR, a self-supervised framework that learns context-aware representations of neuron identity by modeling each neuron's role within the broader population. NuCLR employs a spatio-temporal transformer that captures both within-neuron dynamics and across-neuron interactions. It is trained with a sample-wise contrastive objective that encourages temporally stable and discriminative embeddings. Across multiple open-access datasets, NuCLR outperforms prior methods in both cell type and brain region classification. Critically, it exhibits strong zero-shot generalization to entirely new populations, without any retraining or access to stimulus labels. Furthermore, we demonstrate that our framework scales effectively with data size. Overall, our results demonstrate that modeling population context is crucial for understanding neuron identity, and that a rich signal for cell typing and neuron localization is present in neural activity alone. Code available at: https://github.com/nerdslab/nuclr.
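The sample-wise contrastive objective described in the abstract can be illustrated with an InfoNCE-style loss, where two embeddings of the same neuron taken from different time windows form a positive pair and all other neurons in the batch serve as negatives. The sketch below is a minimal NumPy illustration of this general idea; the function name, temperature value, and details are assumptions for exposition, not the paper's actual implementation.

```python
import numpy as np

def info_nce(z_a, z_b, temperature=0.1):
    """InfoNCE-style contrastive loss (illustrative sketch).

    Rows of z_a and z_b are embeddings of the same neurons computed
    from two different time windows: matching rows are positive pairs,
    all other rows act as negatives.
    """
    # L2-normalize so the dot product is cosine similarity
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    logits = z_a @ z_b.T / temperature           # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # positives sit on the diagonal; minimize their negative log-likelihood
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
# nearly identical views of each neuron -> low loss
loss_pos = info_nce(z, z + 0.01 * rng.normal(size=z.shape))
# unrelated views -> higher loss
loss_rand = info_nce(z, rng.normal(size=(8, 16)))
```

A loss of this form pushes each neuron's embedding to be stable across time windows (the diagonal) while remaining discriminative against the rest of the population (the off-diagonal terms), matching the "temporally stable and discriminative" goal stated in the abstract.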
Primary Area: Neuroscience and cognitive science (e.g., neural coding, brain-computer interfaces)
Submission Number: 23348