Keywords: Generative Models, Physical Systems
TL;DR: We propose a generative model that addresses the challenge of learning nonlinear physical systems by modeling latent spaces.
Abstract: We propose a generative framework for learning nonlinear physical systems, with an emphasis on scalable training and principled stability guarantees. Our approach provides a unified way to compute the required gradients in closed form, with tailored gradient-flow calculations for both continuous and discrete components, yielding a framework that is both theoretically grounded and practically efficient. To address recurrent instabilities, we develop a general input-to-state stability analysis that applies to a broad class of gated RNN architectures under bounded inputs, extending beyond existing restricted settings. Building on this foundation, inference parameterizes the continuous latent states with recurrent networks in the spirit of Gaussian filtering, while the discrete latent dynamics are inferred through conditional neural sampling. This joint design enables end-to-end learning of complex temporal structure without restrictive Markovian transition assumptions. Experiments on synthetic benchmarks and real-world physical dynamical systems demonstrate that our method achieves strong performance in state estimation, regime detection, and imputation under noise and partial observability.
Supplementary Material: zip
Primary Area: generative models
Submission Number: 4761