Generative Manifold Networks for explainable prediction and simulation of complex system dynamics

ICLR 2026 Conference Submission 18912 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Complex system generation, nonlinear dynamics, manifold learning
TL;DR: Generative manifold networks generate complex, nonlinear dynamics based solely on observables, providing explainability.
Abstract: Generative Manifold Networks (GMN) are a new machine learning framework consisting of a network of linked dynamical systems that captures the causal interactions at the core of complex systems. The network is discovered by an interaction function, which can focus on causality, shared information, nonlinearity, or another discrimination metric. Network nodes are interacting, low-dimensional, data-driven state-space generators that accommodate multiscale dynamics. In contrast to many machine learning approaches, GMN has no latent or random variables; it operates solely on observed time series and thus provides explainability. GMN generates short- and long-term chaotic dynamics on par with echo state networks, but with a remarkably reduced number of dimensions and without sensitive dependence on reservoir parameters or random states. As a result of the multiscale representation, GMN is able to learn the complete dynamics of a complex system from limited training data. We demonstrate these features on chaotic dynamics and on neural and behavioral recordings of the fruit fly Drosophila melanogaster.
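The abstract names two ingredients: per-node, low-dimensional, data-driven state-space generators acting on observed time series, and a network discovered by an interaction (discrimination) function. The sketch below is a minimal, hypothetical illustration of these ideas only, assuming simplex-style nearest-neighbor prediction on time-delay embeddings for the node generators and a toy cross-prediction correlation as the interaction metric; the names `delay_embed`, `Node`, `interaction_score` and the parameters `E`, `tau`, `k` are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the two components described in the abstract:
# (1) a node-level, data-driven state-space generator (time-delay embedding
#     with simplex-style nearest-neighbor prediction), and
# (2) network discovery via a pairwise interaction score between observables.
import numpy as np


def delay_embed(x, E, tau):
    """Time-delay embedding of a scalar series x into E dimensions with lag tau."""
    n = len(x) - (E - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(E)])


class Node:
    """One network node: predicts its own next value from its reconstructed manifold."""

    def __init__(self, x, E=3, tau=1, k=4):
        self.x, self.E, self.tau, self.k = np.asarray(x, float), E, tau, k
        self.library = delay_embed(self.x, E, tau)[:-1]       # library states
        self.targets = self.x[(E - 1) * tau + 1:]              # next observed values

    def predict(self, state):
        """Exponentially weighted average of the targets of the k nearest library states."""
        d = np.linalg.norm(self.library - state, axis=1)
        idx = np.argsort(d)[: self.k]
        w = np.exp(-d[idx] / max(d[idx][0], 1e-12))
        return float(np.dot(w, self.targets[idx]) / w.sum())


def interaction_score(x, y, E=3, tau=1, k=4):
    """Toy discrimination metric: how well y's reconstructed states predict x's
    next value (a stand-in for causality / shared-information measures)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    lib = delay_embed(y, E, tau)[:-1]          # y's manifold states
    tgt = x[(E - 1) * tau + 1:]                # x's next observed values
    preds = []
    for s in lib:
        d = np.linalg.norm(lib - s, axis=1)
        idx = np.argsort(d)[1: k + 1]          # skip the point itself
        w = np.exp(-d[idx] / max(d[idx][0], 1e-12))
        preds.append(np.dot(w, tgt[idx]) / w.sum())
    return np.corrcoef(preds, tgt)[0, 1]


# Example: score pairwise interactions among three toy observables.
rng = np.random.default_rng(0)
t = np.linspace(0, 60, 600)
obs = {"a": np.sin(t),
       "b": np.sin(t + 0.5) + 0.1 * rng.standard_normal(600),
       "c": rng.standard_normal(600)}
links = {(i, j): interaction_score(obs[i], obs[j]) for i in obs for j in obs if i != j}
print(max(links, key=links.get))               # strongest inferred interaction
```

In this sketch the interaction score would select which observables become linked nodes, and each `Node` would then generate its own dynamics from observed states alone, which is the sense in which no latent or random variables are involved.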
Primary Area: generative models
Submission Number: 18912