Neural Langevin Dynamics: Towards Interpretable Neural Stochastic Differential Equations

Published: 03 Nov 2023, Last Modified: 23 Dec 2023. Venue: NLDL 2024.
Keywords: deep learning, stochastic differential equations, neural stochastic differential equations, generative modelling, time series, latent SDEs
TL;DR: We replace the drift in latent neural stochastic differential equations with the negative gradient of a learnable energy function, making the resulting equation more interpretable.
Abstract: Neural Stochastic Differential Equations (NSDEs) have been trained both as Variational Autoencoders and as GANs. However, the resulting Stochastic Differential Equations can be hard to interpret or analyse due to the generic nature of the drift and diffusion fields. By restricting our NSDE to the form of Langevin dynamics and training it as a VAE, we obtain NSDEs that lend themselves to more elaborate analysis and to a wider range of visualisation techniques than a generic NSDE. More specifically, we obtain an energy landscape, the minima of which are in one-to-one correspondence with latent states underlying the data. This not only allows us to detect states underlying the data dynamics in an unsupervised manner, but also to infer the distribution of time spent in each state according to the learned SDE. In general, restricting an NSDE to Langevin dynamics enables the use of a large set of tools from computational molecular dynamics for the analysis of the obtained results.
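The restriction described above can be illustrated with a minimal sketch: an overdamped Langevin SDE whose drift is the negative gradient of an energy function, simulated with Euler-Maruyama. Here the energy is a toy quadratic with a hand-written gradient (`grad_energy`, `langevin_step`, and `simulate` are illustrative names, not from the paper's codebase); in the paper's setting the energy would be a learnable neural network and its gradient would come from automatic differentiation.

```python
import numpy as np

def grad_energy(x, mu):
    # Gradient of the toy quadratic energy E(x) = 0.5 * ||x - mu||^2.
    # In the Langevin-NSDE setting, E is a learned network and this
    # gradient is obtained via autodiff instead of a closed form.
    return x - mu

def langevin_step(x, mu, dt, rng):
    # One Euler-Maruyama step of the overdamped Langevin SDE
    #   dX_t = -grad E(X_t) dt + sqrt(2) dW_t
    noise = rng.standard_normal(x.shape)
    return x - grad_energy(x, mu) * dt + np.sqrt(2.0 * dt) * noise

def simulate(x0, mu, dt=0.01, n_steps=5000, seed=0):
    # Integrate the SDE from x0 and return the full trajectory.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    traj = [x.copy()]
    for _ in range(n_steps):
        x = langevin_step(x, mu, dt, rng)
        traj.append(x.copy())
    return np.stack(traj)

# Starting far from the energy minimum at mu = 0, the trajectory
# relaxes towards it: for this quadratic energy the stationary
# distribution is a standard Gaussian centred on mu.
traj = simulate(x0=[3.0, -3.0], mu=np.zeros(2))
print(traj[-3000:].mean(axis=0))
```

Minima of the energy act as metastable states of the dynamics, which is what makes the one-to-one correspondence between energy minima and latent states, and the state dwell-time analysis mentioned in the abstract, possible.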
Git: https://github.com/SimonKoop/NLD-public/
Submission Number: 34