Scalable Gradients and Variational Inference for Stochastic Differential Equations

16 Oct 2019 (modified: 27 Jan 2020) · AABI 2019 Symposium Blind Submission · Readers: Everyone
  • Keywords: variational inference, stochastic variational inference, stochastic differential equation, adjoint sensitivity method, latent variable model
  • TL;DR: We present a constant-memory procedure for computing gradients through solutions of stochastic differential equations (SDEs) and apply it to learning latent SDE models.
  • Abstract: We derive reverse-mode (or adjoint) automatic differentiation for solutions of stochastic differential equations (SDEs), allowing time-efficient and constant-memory computation of pathwise gradients, a continuous-time analogue of the reparameterization trick. Specifically, we construct a backward SDE whose solution is the gradient and provide conditions under which numerical solutions converge. We also combine our stochastic adjoint approach with a stochastic variational inference scheme for continuous-time SDE models, allowing us to learn distributions over functions using stochastic gradient descent. Our latent SDE model achieves competitive performance compared to existing approaches on time series modeling.
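To make the "backward SDE whose solution is the gradient" idea concrete, below is a minimal illustrative sketch, not the paper's implementation. It discretizes the scalar SDE dX = θX dt + σX dW with Euler–Maruyama under fixed noise (the pathwise / reparameterization view), then runs a reverse-mode sweep over the discrete steps — the discrete-time analogue of solving the backward adjoint SDE — and checks the result against a finite difference computed on the same noise path. All names (`simulate`, `adjoint_grad`, the parameter values) are invented for this example.

```python
import random
import math

def simulate(theta, sigma, x0, dt, noise):
    """Forward Euler-Maruyama for dX = theta*X dt + sigma*X dW.
    The noise increments are fixed inputs, so the path is a
    deterministic, differentiable function of theta (pathwise gradient)."""
    xs = [x0]
    for dW in noise:
        x = xs[-1]
        xs.append(x + theta * x * dt + sigma * x * dW)
    return xs

def adjoint_grad(theta, sigma, dt, noise, xs):
    """Reverse-mode sweep over the discrete steps: the discrete
    analogue of integrating the backward adjoint SDE. Computes
    dL/dtheta for the loss L = X_T."""
    a = 1.0          # adjoint state a_k = dL/dx_k, initialized at dL/dx_K = 1
    grad = 0.0
    for k in reversed(range(len(noise))):
        x, dW = xs[k], noise[k]
        grad += a * x * dt                   # direct dependence of step k on theta
        a *= 1.0 + theta * dt + sigma * dW   # propagate adjoint one step backward
    return grad

random.seed(0)
K, dt = 100, 0.01
theta, sigma, x0 = 0.5, 0.2, 1.0
noise = [random.gauss(0.0, math.sqrt(dt)) for _ in range(K)]

xs = simulate(theta, sigma, x0, dt, noise)
g = adjoint_grad(theta, sigma, dt, noise, xs)

# Finite-difference check on the SAME fixed noise path
eps = 1e-6
g_fd = (simulate(theta + eps, sigma, x0, dt, noise)[-1]
        - simulate(theta - eps, sigma, x0, dt, noise)[-1]) / (2 * eps)
```

The storage of `xs` here is what the paper's method avoids: by reconstructing the forward path while solving the backward SDE, memory cost stays constant in the number of solver steps.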