Keywords: Neural population coding, Conditional Density Estimation, Gaussian processes, Variational inference, Probabilistic models
TL;DR: We introduce Continuous Multinomial Logistic Regression (CMLR), a Gaussian-process-regularized exponential-family model for scalable and flexible conditional density estimation in continuous neural decoding tasks.
Abstract: Multinomial logistic regression (MLR) is a classic model for multi-class classification that has been widely used for neural decoding. However, MLR requires a finite set of discrete output classes, limiting its applicability to settings with continuous-valued outputs (e.g., time, orientation, velocity, or spatial position). To address this limitation, we propose Continuous Multinomial Logistic Regression (CMLR), a generalization of MLR to continuous output spaces. CMLR defines a novel exponential-family model for conditional density estimation (CDE), mapping neural population activity to a full probability density over external covariates. It captures the influence of each neuron’s activity on the decoded variable through a smooth, interpretable tuning function, regularized by a Gaussian process prior. The resulting nonparametric decoding model flexibly captures a wide variety of conditional densities, including multimodal, asymmetric, and circular distributions. We apply CMLR to large-scale datasets from mouse and monkey primary visual cortex, mouse hippocampus, and monkey motor cortex, and show that it consistently outperforms a wide variety of other decoding methods, including deep neural networks (DNNs), XGBoost, and FlexCode. It also outperforms correlation-blind models such as Naive Bayes, highlighting the importance of correlations for accurate neural decoding. The CMLR model provides a scalable, flexible, and interpretable method for decoding responses from diverse brain regions.
Supplementary Material: zip
Primary Area: applications to neuroscience & cognitive science
Submission Number: 22512