Hybrid Kernel Stein Variational Gradient Descent

20 Sept 2023 (modified: 11 Feb 2024), submitted to ICLR 2024
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Keywords: Stein Variational Gradient Descent, Approximate Inference, Particle-based Variational Inference
Abstract: Stein variational gradient descent (SVGD) is a particle-based approximate inference algorithm with largely well-understood theoretical properties. In recent years, many variants of SVGD have been proposed and shown to share those properties. A preliminary test of the hybrid kernel variant (h-SVGD) demonstrated promising results on image classification with deep neural network ensembles. However, the theoretical properties of h-SVGD have not yet been established, and its practical advantages have not been fully explored. In this paper, we define a hybrid kernelised Stein discrepancy (h-KSD) and prove that the h-SVGD update direction is optimal within an appropriate reproducing kernel Hilbert space. We also prove a descent lemma that guarantees a decrease in the KL divergence at each step, along with other limit results. Numerical results demonstrate that h-SVGD mitigates the variance collapse behaviour of SVGD at no additional computational cost whilst remaining competitive at inference tasks.
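To make the abstract's description concrete, the sketch below illustrates one hybrid-kernel SVGD update, assuming the common formulation in which the driving (attractive) term and the repulsive term use two possibly different RBF kernels; with identical bandwidths it reduces to standard SVGD. The function names, bandwidth parameters, and the choice of RBF kernels are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def rbf_kernel_and_grad(x, bandwidth):
    """RBF kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 h^2))
    and its gradient with respect to the first argument."""
    diffs = x[:, None, :] - x[None, :, :]               # (n, n, d), x_i - x_j
    sq_dists = np.sum(diffs ** 2, axis=-1)              # (n, n)
    K = np.exp(-sq_dists / (2.0 * bandwidth ** 2))      # (n, n)
    grad_K = -diffs / (bandwidth ** 2) * K[:, :, None]  # grad_{x_i} K[i, j], (n, n, d)
    return K, grad_K

def h_svgd_step(x, score_fn, h_drive=1.0, h_repulse=1.0, step_size=1e-2):
    """One hybrid-kernel SVGD update (illustrative sketch).

    x        : (n, d) array of particles
    score_fn : returns grad log p(x) for each particle, shape (n, d)
    The driving term uses kernel k1 (bandwidth h_drive); the repulsive
    term uses kernel k2 (bandwidth h_repulse).
    """
    n = x.shape[0]
    scores = score_fn(x)                            # (n, d)
    K1, _ = rbf_kernel_and_grad(x, h_drive)         # k1 for the driving term
    _, grad_K2 = rbf_kernel_and_grad(x, h_repulse)  # k2 for the repulsive term
    drive = K1 @ scores                             # sum_j k1(x_j, x_i) grad log p(x_j)
    repulse = grad_K2.sum(axis=0)                   # sum_j grad_{x_j} k2(x_j, x_i)
    phi = (drive + repulse) / n                     # estimated optimal update direction
    return x + step_size * phi

# Example: sampling a 2D standard Gaussian (score is -x)
particles = np.random.randn(50, 2) * 3.0
for _ in range(500):
    particles = h_svgd_step(particles, lambda x: -x, h_drive=1.0, h_repulse=2.0)
```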
Supplementary Material: zip
Submission Number: 2183