MARKED INDUCING POINT CASCADED SDES FOR NEURAL MANIFOLD LEARNING

ICLR 2026 Conference Submission12655 Authors

18 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Neural manifold learning, Cascaded stochastic differential equations (SDEs), Marked inducing points
TL;DR: We introduce MIP-CSDE, a cascaded SDE model with marked inducing points that efficiently uncovers low-dimensional neural manifolds from high-dimensional time series.
Abstract: The manifold hypothesis suggests that high-dimensional neural time series lie on a low-dimensional manifold shaped by simpler underlying dynamics. To uncover this structure, latent dynamical variable models such as state-space models, recurrent neural networks, neural ordinary differential equations, and Gaussian process latent variable models are widely used. We propose MIP-CSDE (Marked Inducing Point Cascaded SDE), a novel cascaded stochastic differential equation model that balances computational efficiency with interpretability and addresses key limitations of existing approaches. Our model assumes that a sparse set of trajectory samples suffices to reconstruct the underlying smooth manifold. The manifold dynamics are modeled using a set of Brownian bridge SDEs, with inducing points (specified in both time and value) drawn from a multivariate marked point process. These Brownian bridges define the drift of a second set of SDEs, whose trajectories are mapped to the observed data. This yields a continuous, differentiable latent process capable of modeling arbitrarily complex time series as the number of inducing points increases. For MIP-CSDE, we derive efficient training and inference procedures, showing that the per-iteration computational complexity of inference scales as O(P · N), i.e., linearly in the observation length N, where P is the number of particles. On both synthetic data and neural recordings, we show that the proposed model accurately recovers the underlying manifold structure and scales effectively with data dimensionality.
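
The sketch below illustrates the cascaded generative structure described in the abstract: Brownian bridges pinned at marked inducing points drive the drift of a second SDE layer, whose trajectory is mapped to high-dimensional observations. It is a minimal, assumption-laden illustration only; the Euler-Maruyama discretization, the linear-Gaussian readout, and all parameter names (sigma_b, sigma_z, C) are placeholders, not the authors' implementation.

```python
# Hypothetical sketch of a MIP-CSDE-style generative process (assumptions:
# Euler-Maruyama simulation, 1-D latent, linear-Gaussian readout).
import numpy as np

rng = np.random.default_rng(0)

# Marked inducing points: times t_k and latent marks z_k (here fixed, not
# sampled from a marked point process as in the paper).
t_ind = np.array([0.0, 0.3, 0.7, 1.0])
z_ind = np.array([0.0, 1.0, -0.5, 0.5])

dt = 1e-3
ts = np.arange(0.0, 1.0, dt)

# First SDE layer: Brownian bridges pinned at consecutive inducing points.
sigma_b = 0.2
b = np.empty_like(ts)
for k in range(len(t_ind) - 1):
    mask = (ts >= t_ind[k]) & (ts < t_ind[k + 1])
    idx = np.flatnonzero(mask)
    x = z_ind[k]
    for i in idx:
        # Brownian-bridge drift pulls the path toward the next inducing mark.
        drift = (z_ind[k + 1] - x) / (t_ind[k + 1] - ts[i])
        x = x + drift * dt + sigma_b * np.sqrt(dt) * rng.standard_normal()
        b[i] = x

# Second SDE layer: latent state whose drift is the bridge process.
sigma_z = 0.1
z = np.empty_like(ts)
z_curr = 0.0
for i in range(len(ts)):
    z_curr = z_curr + b[i] * dt + sigma_z * np.sqrt(dt) * rng.standard_normal()
    z[i] = z_curr

# Observation model: assumed linear mapping to D channels plus noise.
D = 20
C = rng.standard_normal((D, 1))
y = z[:, None] @ C.T + 0.05 * rng.standard_normal((len(ts), D))
print(y.shape)  # (1000, 20)
```

As a design note, the bridge layer gives a continuous, differentiable latent drift whose flexibility grows with the number of inducing points, which is what lets the cascade approximate increasingly complex time series.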
Primary Area: applications to neuroscience & cognitive science
Submission Number: 12655