Keywords: Computational neuroscience, neural data analysis, Bayesian nonparametrics, latent variable modelling
TL;DR: We propose a fully Bayesian nonparametric extension of GPFA that enables discovery of temporally compositional neural manifolds underlying high-dimensional neural population activity.
Abstract: Gaussian Process Factor Analysis (GPFA) is a powerful latent variable model
for extracting low-dimensional manifolds underlying neural population
activity. However, standard GPFA models have two limitations: the number of
latent factors must be pre-specified or selected through heuristic
model-selection procedures, and all factors contribute to the observations at
all times. We propose the infinite GPFA model, a fully Bayesian nonparametric
extension of classical GPFA that places an Indian Buffet Process (IBP) prior
over the factor loading process, making it possible to infer a potentially
infinite set of latent factors, as well as the identity of those factors that
contribute to neural firing in a compositional manner at each time point.
Learning and inference in the infinite GPFA model are performed through
variational expectation-maximisation, and we additionally propose scalable
extensions based on sparse variational Gaussian process methods. We
empirically demonstrate that the infinite GPFA model correctly infers
dynamically changing activations of latent factors on a synthetic dataset. By
fitting the infinite GPFA model to the population activity of hippocampal
place cells during a spatial task with alternating random-foraging and
spatial-memory phases, we identify novel, non-trivial, and behaviourally
meaningful dynamics in the neural encoding process.
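The generative process described in the abstract can be illustrated with a minimal toy sketch: latent factors evolve as independent Gaussian processes, a binary mask (drawn here from a truncated beta-Bernoulli approximation of the IBP) switches factors on and off at each time bin, and observations arise from a linear-Gaussian readout of the active factors only. All variable names, the truncation level, and the kernel settings below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
T, K, N = 100, 5, 30  # time bins, truncated number of latent factors, neurons

# Latent GP trajectories with a squared-exponential kernel (lengthscale assumed)
t = np.linspace(0, 1, T)
Kmat = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 0.1 ** 2)
X = rng.multivariate_normal(np.zeros(T), Kmat + 1e-6 * np.eye(T), size=K).T  # (T, K)

# Truncated beta-Bernoulli (finite approximation to the IBP) over per-time
# factor activations: factor k is active at time t with probability pi[k]
alpha = 2.0
pi = rng.beta(alpha / K, 1.0, size=K)
Z = rng.binomial(1, pi, size=(T, K))  # binary activation mask

# Linear-Gaussian readout: only currently active factors drive neural activity
C = rng.normal(0.0, 1.0, size=(N, K))  # loading matrix
Y = (Z * X) @ C.T + rng.normal(0.0, 0.1, size=(T, N))  # observed activity
```

Inference in the paper reverses this process: given `Y`, the variational EM procedure recovers posteriors over `X`, `Z`, and `C`, with the IBP prior allowing the effective number of active factors to grow with the data rather than being fixed in advance.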
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5086