Using Perturbation to Improve Goodness-of-Fit Tests based on Kernelized Stein Discrepancy

Published: 29 Nov 2022, Last Modified: 17 Nov 2024
Venue: SBM 2022 Poster
Readers: Everyone
Keywords: goodness-of-fit testing, Stein's method, kernel method
TL;DR: We make precise a known low-power issue of goodness-of-fit (GOF) tests based on KSD when the null distribution has well-separated modes, and propose perturbing the null and alternative distributions with a Markov transition kernel to boost test power.
Abstract: Kernelized Stein discrepancy (KSD) is a score-based discrepancy widely employed in goodness-of-fit tests. It is applicable even when the target distribution has an unknown normalising factor, as is common in Bayesian analysis. We show theoretically and empirically that the power of the KSD test can be low when the target distribution has well-separated modes, because too few observations fall in the regions where the score functions of the alternative and target distributions differ most. To improve the test power, we propose perturbing the target and alternative distributions before applying the KSD test. The perturbation uses a Markov transition kernel that leaves the target invariant but perturbs alternatives. We provide numerical evidence that this approach can yield substantially higher power than the standard KSD test when the target and alternative are mixture distributions that differ only in their mixing weights.
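To make the recipe concrete, here is a minimal, illustrative sketch (Python/NumPy) of the two ingredients the abstract describes: a KSD U-statistic, here with the commonly used inverse multiquadric (IMQ) kernel, and a target-invariant perturbation implemented as a few Metropolis-Hastings steps. The bimodal target, the random-walk proposal, and all function names below are assumptions for illustration, not the authors' exact construction.

```python
# Illustrative sketch: KSD U-statistic (IMQ kernel) plus a target-invariant
# Metropolis-Hastings perturbation, for a 1-D Gaussian-mixture target.
# This is NOT the paper's exact kernel choice; it only shows the pipeline.
import numpy as np

def mixture_logp(x, means, weights, sigma=1.0):
    """Unnormalised log-density of a 1-D Gaussian mixture (constants cancel)."""
    z = np.exp(-0.5 * ((x[:, None] - means) / sigma) ** 2)
    return np.log((weights * z).sum(axis=1))

def mixture_score(x, means, weights, sigma=1.0):
    """Score d/dx log p(x); the only target information the KSD test needs."""
    z = weights * np.exp(-0.5 * ((x[:, None] - means) / sigma) ** 2)
    return (z * (-(x[:, None] - means) / sigma**2)).sum(axis=1) / z.sum(axis=1)

def ksd_u_stat(x, score, c=1.0, beta=-0.5):
    """KSD^2 U-statistic with the IMQ kernel k(x,y) = (c^2 + (x-y)^2)^beta."""
    s, r = score(x), x[:, None] - x[None, :]
    base = c**2 + r**2
    k = base**beta
    dkx = 2.0 * beta * r * base ** (beta - 1)   # dk/dx; note dk/dy = -dk/dx
    dkxy = (-2.0 * beta * base ** (beta - 1)
            - 4.0 * beta * (beta - 1) * r**2 * base ** (beta - 2))
    # Stein kernel h(x_i, x_j) = s_i s_j k + s_i dk/dy + s_j dk/dx + d^2k/dxdy
    h = s[:, None] * s[None, :] * k + (s[None, :] - s[:, None]) * dkx + dkxy
    n = len(x)
    return (h.sum() - np.trace(h)) / (n * (n - 1))

def mh_perturb(x, logp, rng, step=4.0, n_steps=5):
    """A few random-walk Metropolis steps: a Markov kernel that leaves the
    target invariant (null samples stay null) but moves alternative samples."""
    for _ in range(n_steps):
        prop = x + step * rng.normal(size=x.shape)
        accept = np.log(rng.uniform(size=len(x))) < logp(prop) - logp(x)
        x = np.where(accept, prop, x)
    return x

# Target: balanced bimodal mixture; alternative: same modes, weights 0.9/0.1.
rng = np.random.default_rng(0)
means, weights = np.array([-6.0, 6.0]), np.array([0.5, 0.5])
heavy = rng.uniform(size=500) < 0.9
x = np.where(heavy, rng.normal(-6.0, 1.0, 500), rng.normal(6.0, 1.0, 500))
score = lambda t: mixture_score(t, means, weights)
logp = lambda t: mixture_logp(t, means, weights)
print("KSD^2 on raw samples:      ", ksd_u_stat(x, score))
print("KSD^2 on perturbed samples:", ksd_u_stat(mh_perturb(x, logp, rng), score))
```

The key property, stated in the abstract, is that null samples remain distributed according to the target after perturbation (so validity is preserved), while samples from an alternative are moved, potentially into the low-density regions where the score mismatch becomes visible to the test.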
Student Paper: Yes
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/using-perturbation-to-improve-goodness-of-fit/code)