Is the information geometry of probabilistic population codes learnable?

26 Sept 2022, 12:09 (modified: 09 Nov 2022, 02:12) · NeurReps 2022 Oral
Keywords: Probabilistic population codes, information geometry, manifold learning
TL;DR: The latent manifold of PPCs is a statistical manifold, and its metric can be obtained by measuring covariance matrices.
Abstract: Learning the geometry of latent neural manifolds from neural activity data is difficult in part because the ground truth is generally not known, which makes manifold learning methods hard to evaluate. Probabilistic population codes (PPCs), a class of biologically plausible and self-consistent models of neural populations that encode parametric probability distributions, may offer a theoretical setting in which manifold learning can be studied rigorously. It is natural to define the neural manifold of a PPC as the statistical manifold of the encoded distribution, and we derive a mathematical result showing that the information geometry of this statistical manifold is directly related to measurable covariance matrices. This suggests a simple but rigorously justified decoding strategy based on principal component analysis, which we illustrate using an analytically tractable PPC.
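The abstract's claim that information geometry is tied to measurable covariance matrices can be sketched in a toy setting. The snippet below is a minimal illustration, not the paper's actual construction: it assumes an independent-Poisson population with Gaussian tuning curves over a 1-D stimulus (all parameters and names are hypothetical). For such a code the trial-to-trial variance of each neuron equals its mean rate, so the Fisher-information metric g(s) = f'(s)ᵀ diag(f(s))⁻¹ f'(s) can be estimated from measured means and covariances, and PCA on the mean responses exposes the low-dimensional latent manifold.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy PPC: N independent Poisson neurons with Gaussian
# tuning curves over a 1-D stimulus s (all parameters illustrative).
N = 50
centers = np.linspace(-2.0, 2.0, N)
width = 0.5
gain = 20.0

def tuning(s):
    """Mean firing rates f(s) of the population at stimulus s."""
    return gain * np.exp(-0.5 * ((s - centers) / width) ** 2)

# Sample spike-count responses at several stimulus values.
stimuli = np.linspace(-1.0, 1.0, 9)
trials = 2000
responses = np.stack([rng.poisson(tuning(s), size=(trials, N))
                      for s in stimuli])            # (9, trials, N)

# Empirical tuning curves and per-neuron variances. For independent
# Poisson neurons the covariance is diagonal with the mean rate on the
# diagonal (Fano factor 1), so variance should track the mean.
mean_resp = responses.mean(axis=1)                  # (9, N)
var_resp = responses.var(axis=1)                    # (9, N)

# Empirical Fisher metric via finite differences of the mean responses:
# g(s) ~ sum_i f_i'(s)^2 / f_i(s), using the measured means/variances.
ds = stimuli[1] - stimuli[0]
df = np.gradient(mean_resp, ds, axis=0)             # (9, N)
g_emp = (df ** 2 / np.maximum(var_resp, 1e-9)).sum(axis=1)

# PCA on the mean responses: the latent 1-D manifold is a curve in R^N,
# and the leading principal components capture it.
X = mean_resp - mean_resp.mean(axis=0)
_, svals, _ = np.linalg.svd(X, full_matrices=False)
explained = svals ** 2 / (svals ** 2).sum()
print("Fano factor (median):", np.median(var_resp / np.maximum(mean_resp, 1e-9)))
print("empirical metric g(s):", np.round(g_emp, 1))
print("PC variance ratios:", np.round(explained[:3], 3))
```

Because the code here is translation-invariant, g(s) should be roughly constant across the sampled stimuli; deviations reflect finite-trial noise and the finite-difference approximation at the endpoints.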