Targeted Separation and Convergence with Kernel Discrepancies

Published: 29 Nov 2022, Last Modified: 05 May 2023, SBM 2022 Oral
Keywords: reproducing kernel, Stein discrepancy, separating measures, weak convergence control, maximum mean discrepancy
Abstract: Kernel Stein discrepancies (KSDs) are maximum mean discrepancies (MMDs) that leverage the score information of distributions, and have grown central to a wide range of applications. In most settings, these MMDs are required to $(i)$ separate a target $\mathrm{P}$ from other probability measures or even $(ii)$ control weak convergence to $\mathrm{P}$. In this article we derive new sufficient and necessary conditions that substantially broaden the known conditions for KSD separation and convergence control, and develop the first KSDs known to metrize weak convergence to $\mathrm{P}$. Along the way, we highlight the implications of our results for hypothesis testing, measuring and improving sample quality, and sampling with Stein variational gradient descent.
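To make the abstract's central object concrete: for a target $\mathrm{P}$ with density $p$ and score $s_{\mathrm{P}}(x) = \nabla \log p(x)$, a KSD evaluates a sample against $\mathrm{P}$ by averaging a Stein kernel $k_{\mathrm{P}}(x, y)$ built from a base reproducing kernel $k$ and the score, so that $\mathrm{KSD}_{\mathrm{P}}^2(\mathrm{Q}_n) = \frac{1}{n^2}\sum_{i,j} k_{\mathrm{P}}(x_i, x_j)$ needs only the score of $\mathrm{P}$, not its normalizing constant. The sketch below is a hedged illustration of this construction, not code from the paper: it uses the Langevin Stein kernel with an inverse multiquadric base kernel $k(x,y) = (c^2 + \|x-y\|^2)^{\beta}$, $\beta = -1/2$; the constants `c` and `beta`, the function names, and the standard-Gaussian example target are all illustrative assumptions.

```python
import numpy as np

def stein_kernel_imq(X, score, c=1.0, beta=-0.5):
    """Langevin Stein kernel k_P(x, y) = div_x div_y k + grad_x k . s(y)
    + grad_y k . s(x) + k s(x).s(y), built from the IMQ base kernel
    k(x, y) = (c^2 + ||x - y||^2)^beta, evaluated on all sample pairs."""
    n, d = X.shape
    S = score(X)                           # (n, d) score evaluations
    diffs = X[:, None, :] - X[None, :, :]  # (n, n, d) pairwise x - y
    r2 = np.sum(diffs**2, axis=-1)         # squared pairwise distances
    u = c**2 + r2
    k = u**beta
    # grad_x k = 2*beta*u^(beta-1)*(x - y); grad_y k is its negative
    coef = 2 * beta * u**(beta - 1)
    grad_x_dot_sy = coef * np.einsum('ijd,jd->ij', diffs, S)
    grad_y_dot_sx = -coef * np.einsum('ijd,id->ij', diffs, S)
    # trace of the mixed second-derivative matrix of the IMQ kernel
    trace = -4 * beta * (beta - 1) * u**(beta - 2) * r2 - 2 * beta * d * u**(beta - 1)
    return trace + grad_x_dot_sy + grad_y_dot_sx + k * (S @ S.T)

def ksd(X, score):
    """V-statistic estimate of the KSD of sample X against the target
    whose score function (gradient of the log-density) is `score`."""
    return np.sqrt(stein_kernel_imq(X, score).mean())

# Example: measuring sample quality against a standard Gaussian target,
# whose score is s(x) = -x (no normalizing constant needed).
rng = np.random.default_rng(0)
good = rng.normal(size=(200, 2))
bad = rng.normal(loc=1.5, size=(200, 2))
print(ksd(good, lambda x: -x))  # small: sample matches the target
print(ksd(bad, lambda x: -x))   # larger: sample is off-target
```

The paper's separation and convergence-control questions ask when such a discrepancy vanishes only for $\mathrm{Q} = \mathrm{P}$ and when small values guarantee weak convergence to $\mathrm{P}$; the IMQ base kernel above is one standard choice, and the paper's results characterize which choices suffice.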
Student Paper: No