Amortized Reparametrization: Efficient and Scalable Variational Inference for Latent SDEs

Published: 21 Sept 2023 · Last Modified: 11 Jan 2024 · NeurIPS 2023 poster
Keywords: variational inference, differential equations, dynamical systems, neural ordinary differential equations, latent stochastic differential equations
TL;DR: We present a new amortization strategy for training latent stochastic differential equations far more quickly than adjoint-based methods.
Abstract: We consider the problem of inferring latent stochastic differential equations (SDEs) with a time and memory cost that scales independently of the amount of data, the total length of the time series, and the stiffness of the approximate differential equation. This is in stark contrast to typical methods for inferring latent differential equations which, despite their constant memory cost, have a time complexity that is heavily dependent on the stiffness of the approximate differential equation. We achieve this computational advancement by removing the need to solve differential equations when approximating gradients, using a novel amortization strategy coupled with a recently derived reparametrization of expectations under linear SDEs. We show that, in practice, this allows us to achieve similar performance to methods based on adjoint sensitivities with more than an order of magnitude fewer evaluations of the model during training.
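To make the mechanism concrete, below is a minimal, hypothetical sketch (not the authors' implementation) of the core idea: because the variational posterior over the latent state is a linear SDE, its marginals are Gaussian, z(t) ~ N(m(t), S(t)), so expectations under the posterior can be reparametrized as z(t) = m(t) + L(t)·ε with S(t) = L(t)L(t)ᵀ, and gradient estimates never require an ODE/SDE solver. The networks `mean_net`, `chol_net`, `drift_net`, and `decoder`, and the finite-difference residual penalty standing in for the paper's KL term, are all illustrative assumptions.

```python
# Minimal sketch, assuming a diagonal posterior scale and a squared-error
# likelihood. No part of this is the paper's actual code.
import torch

latent_dim, obs_dim = 2, 3
mean_net = torch.nn.Sequential(torch.nn.Linear(1, 64), torch.nn.Tanh(),
                               torch.nn.Linear(64, latent_dim))   # m(t)
chol_net = torch.nn.Sequential(torch.nn.Linear(1, 64), torch.nn.Tanh(),
                               torch.nn.Linear(64, latent_dim))   # diagonal L(t)
drift_net = torch.nn.Sequential(torch.nn.Linear(latent_dim, 64), torch.nn.Tanh(),
                                torch.nn.Linear(64, latent_dim))  # learned drift
decoder = torch.nn.Linear(latent_dim, obs_dim)

def elbo_minibatch(t, x, n_samples=8):
    """Monte Carlo ELBO estimate on a minibatch of time points.

    Cost is independent of the series length and of the stiffness of the
    learned drift: no differential equation is solved anywhere.
    """
    m = mean_net(t)                                 # posterior mean m(t)
    s = torch.nn.functional.softplus(chol_net(t))   # diagonal scale L(t)
    eps = torch.randn(n_samples, *m.shape)
    z = m + s * eps                                 # reparametrized samples
    # Reconstruction term: decode samples and score the observations.
    log_lik = -((decoder(z) - x) ** 2).sum(-1).mean()
    # Stand-in regularizer: penalize mismatch between the posterior's
    # finite-difference dynamics and the learned drift (hypothetical
    # surrogate for the KL/residual term in the actual objective).
    dm_dt = (m[1:] - m[:-1]) / (t[1:] - t[:-1])
    residual = ((dm_dt - drift_net(m[:-1])) ** 2).sum(-1).mean()
    return log_lik - residual

t = torch.linspace(0, 1, 32).unsqueeze(-1)
x = torch.randn(32, obs_dim)
loss = -elbo_minibatch(t, x)
loss.backward()  # gradients flow through samples, never through a solver
```

The key design point the sketch illustrates: each gradient step touches only a minibatch of time points, so the per-step cost is fixed regardless of how long the series is or how stiff the learned dynamics become.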
Submission Number: 8107