Sliced Kernelized Stein Discrepancy

28 Sept 2020, 15:52 (modified: 10 Feb 2022, 11:45) · ICLR 2021 Poster
Keywords: kernel methods, variational inference, particle inference
Abstract: Kernelized Stein discrepancy (KSD), though extensively used in goodness-of-fit tests and model learning, suffers from the curse of dimensionality. We address this issue by proposing the sliced Stein discrepancy and its scalable and kernelized variants, which employ kernel-based test functions defined on optimal one-dimensional projections. When applied to goodness-of-fit tests, extensive experiments show that the proposed discrepancy significantly outperforms KSD and various baselines in high dimensions. For model learning, we demonstrate its advantages by training an independent component analysis model, compared against existing Stein discrepancy baselines. We further propose a novel particle inference method called sliced Stein variational gradient descent (S-SVGD), which alleviates the mode-collapse issue of SVGD when training variational autoencoders.
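To make the construction concrete, below is a minimal NumPy sketch of a sliced kernelized Stein discrepancy estimator. This is not the paper's released implementation (see the linked repository for that): it assumes a Gaussian RBF kernel on the one-dimensional projections and, for simplicity, reuses the same unit vectors for both the score projection g and the input projection r instead of optimizing the directions as the paper does. The function name `sksd_rbf` is illustrative.

```python
import numpy as np

def sksd_rbf(x, score, directions, sigma=1.0):
    """U-statistic estimate of a sliced kernelized Stein discrepancy (sketch).

    x          : (n, d) samples from q
    score      : (n, d) grad log p evaluated at x
    directions : (m, d) unit slicing vectors, used here for both the score
                 projection g and the input projection r (so g^T r = 1)
    """
    n = x.shape[0]
    total = 0.0
    for r in directions:
        a = x @ r          # (n,) projected inputs x^T r
        s = score @ r      # (n,) projected scores g^T grad log p, with g = r
        diff = a[:, None] - a[None, :]
        k = np.exp(-diff**2 / (2.0 * sigma**2))
        dk_da = -diff / sigma**2 * k                   # d k / d a_i
        dk_db = diff / sigma**2 * k                    # d k / d a_j
        d2k = (1.0 / sigma**2 - diff**2 / sigma**4) * k
        # Stein kernel h(x_i, x_j) of the 1-D slice
        h = (s[:, None] * s[None, :] * k
             + s[:, None] * dk_db + s[None, :] * dk_da + d2k)
        np.fill_diagonal(h, 0.0)                       # U-statistic: drop i == j
        total += h.sum() / (n * (n - 1))
    return total / len(directions)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal((200, 5))
    # Score of a standard Gaussian target: grad log p(x) = -x.
    print(sksd_rbf(x, -x, np.eye(5)))  # near 0 when q matches p
```

Because every kernel evaluation happens on scalars, the estimator sidesteps the high-dimensional RBF kernel whose test power degrades with dimension, which is the core idea behind the sliced construction.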
One-sentence Summary: We propose a method to tackle the curse-of-dimensionality issue of kernelized Stein discrepancy with the RBF kernel, along with a novel particle inference algorithm that resolves the vanishing repulsive force issue of Stein variational gradient descent.
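For intuition on the particle inference side, here is a hedged sketch of one S-SVGD-style update. It fixes the slicing directions to the coordinate basis rather than the adaptively chosen directions used in the paper, and `s_svgd_step` with its defaults is illustrative, not the released API. Each slice gets its own one-dimensional kernel, so the repulsive term acts slice by slice instead of through a single high-dimensional RBF kernel whose repulsion vanishes as dimension grows.

```python
import numpy as np

def s_svgd_step(x, score_fn, eps=0.05, sigma=1.0):
    """One sliced-SVGD-style particle update (illustrative simplification).

    x        : (n, d) particles
    score_fn : callable returning (n, d) grad log p at the particles
    Slicing directions are fixed to the coordinate basis g = r = e_k.
    """
    n, d = x.shape
    s = score_fn(x)
    phi = np.zeros_like(x)
    for k_dim in range(d):
        a = x[:, k_dim]                    # particles projected on e_k
        sg = s[:, k_dim]                   # scores projected on e_k
        diff = a[:, None] - a[None, :]
        kern = np.exp(-diff**2 / (2.0 * sigma**2))
        drive = kern @ sg / n              # pull toward high density
        repel = (-diff / sigma**2 * kern).sum(axis=0) / n  # 1-D repulsion
        phi[:, k_dim] = drive + repel
    return x + eps * phi

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal((100, 50)) + 3.0   # particles start off-target
    for _ in range(500):
        x = s_svgd_step(x, lambda z: -z)       # target: standard Gaussian
    print(x.mean(), x.std())                   # drifts toward (0, 1)
```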
Supplementary Material: zip
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Code: [WenboGong/Sliced_Kernelized_Stein_Discrepancy](https://github.com/WenboGong/Sliced_Kernelized_Stein_Discrepancy)