Continuous-time Particle Filtering for Latent Stochastic Differential Equations

TMLR Paper 2519 Authors

13 Apr 2024 (modified: 18 Apr 2024) · Under review for TMLR · CC BY-SA 4.0
Abstract: Particle filtering is a standard Monte Carlo approach for a wide range of sequential inference tasks. The key component of a particle filter is a set of particles with importance weights that serve as a proxy for the true posterior distribution of some stochastic process. In this work, we propose continuous latent particle filters, an approach that extends particle filtering to the continuous-time domain of latent neural stochastic differential equations. We demonstrate how continuous latent particle filters can be used as a generic plug-in replacement for inference techniques relying on a learned variational posterior. Our experiments with different model families based on latent neural stochastic differential equations show that continuous-time particle filtering achieves superior performance on inference tasks such as likelihood estimation and sequential prediction across a variety of synthetic and real-world datasets.
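To make the core idea concrete, below is a minimal, hypothetical sketch of a bootstrap particle filter applied to a latent SDE discretized with Euler–Maruyama. This is not the paper's model or code: the drift `f`, diffusion `g`, Gaussian observation density, and all parameter choices are illustrative placeholders, used only to show how weighted particles propagated through a latent SDE can approximate the posterior and estimate the marginal likelihood.

```python
# Hypothetical illustration of a bootstrap particle filter for a latent SDE
#   dz = f(z) dt + g(z) dW,  observed through p(x_t | z_t).
# All functions and settings here are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def f(z):                      # illustrative drift
    return -z

def g(z):                      # illustrative (constant) diffusion
    return 0.5 * np.ones_like(z)

def log_obs_density(x, z):     # illustrative Gaussian observation model
    return -0.5 * np.sum((x - z) ** 2, axis=-1)

def particle_filter(xs, ts, n_particles=256, d=1):
    """Euler-Maruyama propagation + importance reweighting + resampling.
    Returns an estimate of the log marginal likelihood log p(x_{1:T})."""
    z = rng.standard_normal((n_particles, d))   # particles drawn from the prior
    log_lik = 0.0
    for k in range(len(ts) - 1):
        dt = ts[k + 1] - ts[k]
        # Propagate each particle through the latent SDE (Euler-Maruyama step).
        dW = rng.standard_normal(z.shape) * np.sqrt(dt)
        z = z + f(z) * dt + g(z) * dW
        # Reweight particles by the observation likelihood at the next time point.
        log_w = log_obs_density(xs[k + 1], z)
        log_lik += np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()
        # Multinomial resampling to avoid weight degeneracy.
        w = np.exp(log_w - log_w.max())
        z = z[rng.choice(n_particles, size=n_particles, p=w / w.sum())]
    return log_lik
```

In this sketch, the running sum of log-average weights yields the standard particle-filter estimate of the log marginal likelihood, which is the kind of quantity the abstract refers to for likelihood estimation; in the paper's setting, the hand-written drift and diffusion would be replaced by learned neural networks.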
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~George_Papamakarios1
Submission Number: 2519