Resampling Gradients Vanish in Differentiable Sequential Monte Carlo Samplers

01 Mar 2023 (modified: 20 Apr 2023) · Submitted to Tiny Papers @ ICLR 2023 · Readers: Everyone
Keywords: Sequential Monte Carlo, Annealed Importance Sampling, Gradient Variance
TL;DR: We propose a differentiable Sequential Monte Carlo Sampler and show how to avoid large gradient variance.
Abstract: Annealed Importance Sampling (AIS) moves particles along a Markov chain from a tractable initial distribution to an intractable target distribution. The recently proposed Differentiable AIS (DAIS) (Geffner & Domke, 2021; Zhang et al., 2021) enables efficient optimization of the transition kernels and distributions of AIS. However, we observe a low effective sample size in DAIS, indicating degenerate distributions. We therefore propose to extend DAIS with a resampling step inspired by Sequential Monte Carlo. Surprisingly, we find empirically, and can explain theoretically, that it is not necessary to differentiate through the resampling step. This avoids the gradient-variance issues observed in similar approaches for Particle Filters (Maddison et al., 2017; Naesseth et al., 2018; Le et al., 2018).
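To make the key idea concrete, here is a minimal sketch in JAX of an ESS-triggered resampling step with gradients explicitly blocked from flowing through the discrete ancestor indices. This is not the authors' implementation: the function names, the multinomial resampling scheme, and the 0.5 ESS threshold are illustrative assumptions.

```python
import jax
import jax.numpy as jnp
from jax.scipy.special import logsumexp


def effective_sample_size(log_w):
    # ESS = (sum w)^2 / sum w^2, computed stably in log space.
    return jnp.exp(2.0 * logsumexp(log_w) - logsumexp(2.0 * log_w))


def resample_if_degenerate(key, particles, log_w, threshold=0.5):
    """Resample particles when the ESS falls below threshold * n."""
    n = particles.shape[0]
    ess = effective_sample_size(log_w)

    def resample(_):
        # Multinomial resampling: draw ancestor indices in proportion
        # to the particle weights. The indices are discrete, so no
        # gradient flows through them; stop_gradient makes the paper's
        # "do not differentiate through resampling" choice explicit.
        idx = jax.random.categorical(key, log_w, shape=(n,))
        idx = jax.lax.stop_gradient(idx)
        # After resampling, all particles carry equal (log-)weight.
        return particles[idx], jnp.zeros(n)

    def keep(_):
        return particles, log_w

    return jax.lax.cond(ess < threshold * n, resample, keep, None)


# Toy usage with random particles and weights:
key = jax.random.PRNGKey(0)
particles = jax.random.normal(key, (128, 2))
log_w = jax.random.normal(key, (128,))
particles, log_w = resample_if_degenerate(key, particles, log_w)
```

Per the abstract, detaching this step, rather than differentiating through it with relaxations as in differentiable particle filters, is what avoids the gradient-variance issues cited above.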