Tensor Brain: Structured Probabilistic Modeling for Neural Population Activity with Tensor Networks

06 May 2026 (modified: 09 May 2026) · ICML 2026 Workshop CoLoRAI Submission · CC BY 4.0
Keywords: Tensor networks, probabilistic modeling, neural population activity
TL;DR: Tensor Brain is an explicit tensor-network probabilistic model for neural spike trains that enables stable likelihood evaluation and generation of neural population activity.
Abstract: Probabilistic modeling of neural population activity requires representing the high-dimensional joint distribution of coordinated spike trains across neurons and time. However, the exponential growth of the joint state space makes direct modeling computationally intractable, and most existing approaches rely on latent variables without explicitly parameterizing the full joint distribution. We propose Tensor Brain (TB), a tensor-network-based probabilistic framework that factorizes the full joint distribution into structured tensor contractions. TB decomposes neural activity into a tree tensor network for spatial interactions and a matrix product state for temporal dependencies, enabling tractable joint modeling without introducing latent variables and supporting exact likelihood evaluation. To ensure numerical stability in long tensor contractions, we derive a scaled contraction scheme and optimize core tensors under isometric constraints on the Stiefel manifold. Experiments on synthetic data, rat PFC recordings, and the MC Maze dataset show that TB reliably captures population spike count statistics and provides competitive sample-based performance on real neural recordings, while retaining tractable exact likelihoods. These results demonstrate that TB provides an efficient and principled explicit probabilistic framework for modeling neural population activity.
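The scaled contraction scheme mentioned in the abstract addresses a well-known issue: contracting a matrix product state over many time steps multiplies together many small or large factors, so a naive product under- or overflows. A minimal NumPy sketch of the standard remedy — renormalizing the running boundary vector at each step and accumulating the scales in log space — is below. All names, shapes, and the random parameterization are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy MPS over T time steps: each site has S spike-count
# states and bond dimension D (assumed values, for illustration only).
T, S, D = 200, 4, 8
cores = [rng.standard_normal((D, S, D)) * 0.5 for _ in range(T)]
left_boundary = np.ones(D) / np.sqrt(D)
right_boundary = np.ones(D) / np.sqrt(D)

def scaled_log_amplitude(spikes):
    """Contract the MPS along one spike train, rescaling the running
    vector at every step and accumulating the log of the scales, so
    the result stays finite even for long sequences."""
    v = left_boundary
    log_scale = 0.0
    for t, x in enumerate(spikes):
        v = v @ cores[t][:, x, :]   # absorb site t's core at observed state x
        nrm = np.linalg.norm(v)
        v = v / nrm                 # keep the running vector at unit norm
        log_scale += np.log(nrm)    # track the magnitude in log space
    return log_scale + np.log(abs(v @ right_boundary))

spikes = rng.integers(0, S, size=T)
print(scaled_log_amplitude(spikes))  # a finite log-magnitude, even for long T
```

In an actual likelihood computation this log-amplitude would be combined with a normalization term (itself computed with the same scaled contraction); the sketch shows only the stability mechanism.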
Submission Number: 47