Towards Characterizing the High-dimensional Bias of Kernel-based Particle Inference Algorithms

Oct 16, 2019 (edited Dec 07, 2019) · AABI 2019 Symposium Blind Submission
  • Keywords: SVGD, MMD, Particle inference algorithms, Stein's Lemma, Kernel methods
  • TL;DR: Analyze the underlying mechanisms of variance collapse of SVGD in high dimensions.
  • Abstract: Particle-based inference algorithms are a promising approach to efficiently generating samples from an intractable target distribution by iteratively updating a set of particles. As a notable example, Stein variational gradient descent (SVGD) provides a deterministic and computationally efficient update, but it is known to underestimate the variance in high dimensions, and the mechanism behind this is poorly understood. In this work we explore a connection between SVGD and an MMD-based inference algorithm via Stein's lemma. By comparing the two update rules, we identify the source of bias in SVGD as a combination of high variance and deterministic bias, and empirically demonstrate that removing either factor leads to accurate estimation of the variance. In addition, for learning a high-dimensional Gaussian target, we analytically derive the converged variance for both algorithms, and confirm that only SVGD suffers from the "curse of dimensionality".
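For reference, the deterministic SVGD update the abstract refers to moves each particle along φ(x_i) = (1/n) Σ_j [k(x_j, x_i) ∇ log p(x_j) + ∇_{x_j} k(x_j, x_i)]: a kernel-weighted attraction toward high-density regions plus a repulsion term that keeps particles spread out. Below is a minimal NumPy sketch with an RBF kernel; the bandwidth, step size, and Gaussian-target usage are illustrative choices, not details taken from the paper.

```python
import numpy as np

def svgd_update(X, grad_logp, h=1.0):
    """One SVGD direction for particles X of shape (n, d).

    phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j)
                             + grad_{x_j} k(x_j, x_i) ]
    with an RBF kernel k(x, y) = exp(-||x - y||^2 / (2 h^2)).
    """
    n = X.shape[0]
    diffs = X[:, None, :] - X[None, :, :]                 # (n, n, d)
    K = np.exp(-np.sum(diffs**2, axis=-1) / (2 * h**2))   # K[j, i] = k(x_j, x_i)
    attract = K @ grad_logp(X)                            # kernel-weighted scores
    # sum_j grad_{x_j} k(x_j, x_i) = sum_j (x_i - x_j) K[j, i] / h^2
    repulse = (np.sum(K, axis=0)[:, None] * X - K @ X) / h**2
    return (attract + repulse) / n

# Illustrative usage: standard Gaussian target, so grad log p(x) = -x.
rng = np.random.default_rng(0)
X = 3.0 + rng.normal(size=(100, 2))     # particles initialized off-center
for _ in range(200):
    X = X + 0.1 * svgd_update(X, lambda X: -X)
```

In low dimensions the repulsion term roughly balances the attraction and the particle spread approximates the target variance; the paper's point is that in high dimensions this balance fails for SVGD and the variance collapses.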