Structure-Preserving Machine Learning of Dynamical Systems: A Case for Smaller Models

ICLR 2026 Conference Submission 17244 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: generalization, dynamical systems, non-Euclidean geometry, structure-preservation, small models, scientific machine learning
TL;DR: We demonstrate that learning dynamical systems on structure-rich manifolds reduces reliance on model size, resulting in more stable roll-out generalization across varying initial conditions.
Abstract: Dynamical systems naturally evolve on structure-rich manifolds, yet naive machine learning models learn dynamics in flat Euclidean embeddings. This mismatch forces models to implicitly learn geometric constraints, resulting in data-intensive training and limited generalization across operating conditions. In this work, we demonstrate how geometry-informed inductive biases reduce the dependence on larger models for achieving robust generalization. We investigate a dissipative and a conservative system as use cases. In the dissipative case, we identify a 2-dimensional heat transfer system using a linear state-space formulation in which the state operator is constrained to be symmetric positive definite via Riemannian optimization. In the conservative case, we model an 18-dimensional Fermi-Pasta-Ulam-Tsingou (FPUT) system on its native symplectic manifold using a symplectic Hamiltonian neural network (SHNN). In the latter case, we show how structurally naive models suffer from energy drift relative to the true energy surface, leading to fragile roll-out generalization, unlike SHNNs, which conserve phase-space volume along the correct energy level.
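
To make the dissipative use case concrete, the sketch below fits a discrete linear state-space model x_{k+1} = A x_k with the state operator A constrained to the symmetric positive definite (SPD) manifold and updated by Riemannian optimization. This is a minimal illustration, not the authors' implementation: the choice of the `geoopt` library, the parameter names, and the training loop are assumptions for exposition.

```python
# Minimal sketch: SPD-constrained linear state-space identification
# via Riemannian optimization (assumes the geoopt library; hypothetical setup).
import torch
import geoopt

dim = 2  # 2-dimensional heat transfer state, as in the abstract
manifold = geoopt.manifolds.SymmetricPositiveDefinite()
A = geoopt.ManifoldParameter(torch.eye(dim), manifold=manifold)  # state operator on SPD

opt = geoopt.optim.RiemannianAdam([A], lr=1e-2)

def rollout(x0, steps):
    """Roll the linear model forward: x_{k+1} = A x_k."""
    xs, x = [x0], x0
    for _ in range(steps):
        x = x @ A.T
        xs.append(x)
    return torch.stack(xs)  # (steps+1, batch, dim)

def train_step(x_data):
    """x_data: (batch, T, dim) trajectory snapshots of the dissipative system."""
    opt.zero_grad()
    pred = rollout(x_data[:, 0], x_data.shape[1] - 1)
    loss = torch.mean((pred.transpose(0, 1) - x_data) ** 2)
    loss.backward()
    opt.step()  # Riemannian step + retraction keeps A on the SPD manifold
    return loss.item()
```

The Riemannian optimizer projects Euclidean gradients onto the SPD manifold and retracts after each step, so symmetry and positive definiteness of A are enforced by construction rather than learned from data.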
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 17244