Stein Random Feature Regression

Published: 26 Apr 2024 · Last Modified: 15 Jul 2024 · UAI 2024 poster · CC BY 4.0
Keywords: Gaussian processes, kernel learning, random Fourier features, Stein variational gradient descent
TL;DR: We apply Stein variational gradient descent to kernel approximation and to Bayesian learning of kernel spectral measure posteriors.
Abstract: In large-scale regression problems, random Fourier features (RFFs) have significantly enhanced the computational scalability and flexibility of Gaussian processes (GPs) by defining kernels through their spectral density, from which a finite set of Monte Carlo samples can be used to form an approximate low-rank GP. However, the efficacy of RFFs in kernel approximation and Bayesian kernel learning depends on the ability to tractably sample the kernel spectral measure and on the quality of the generated samples. We introduce Stein random features (SRF), leveraging Stein variational gradient descent, which can be used both to generate high-quality RFF samples of known spectral densities and to flexibly and efficiently approximate traditionally non-analytical spectral measure posteriors. SRFs require only the evaluation of log-probability gradients to perform both kernel approximation and Bayesian kernel learning, resulting in superior performance over traditional approaches. We empirically validate the effectiveness of SRFs by comparing them to baselines on kernel approximation and well-known GP regression problems.
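As background for the setup the abstract describes, the sketch below illustrates the two ingredients in NumPy: approximating an RBF kernel with random Fourier features sampled from its spectral density, and a single Stein variational gradient descent update that transports frequency particles toward a target spectral density. This is an illustrative sketch, not the authors' implementation (see the Code Url for that); the names `W`, `b`, `rff`, and `svgd_step`, and all hyperparameter values, are choices made here for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# ---- Random Fourier features: approximate an RBF kernel ----
# For a unit-lengthscale RBF kernel, the spectral density is N(0, I),
# so Monte Carlo frequency samples W give z(x) = sqrt(2/D) * cos(W x + b)
# with K(x, y) ≈ z(x) . z(y).
d, D = 2, 2000
W = rng.standard_normal((D, d))        # frequency samples from N(0, I)
b = rng.uniform(0.0, 2.0 * np.pi, D)   # random phases

def rff(X):
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

X = rng.standard_normal((5, d))
K_approx = rff(X) @ rff(X).T
K_exact = np.exp(-0.5 * ((X[:, None] - X[None, :]) ** 2).sum(-1))
err = np.abs(K_approx - K_exact).max()   # shrinks as D grows, O(1/sqrt(D))

# ---- One SVGD update: move frequency particles toward a target density ----
def svgd_step(x, score, eps=0.1, h=1.0):
    diff = x[:, None, :] - x[None, :, :]              # (n, n, d): x_i - x_j
    K = np.exp(-(diff ** 2).sum(-1) / (2 * h ** 2))   # RBF kernel between particles
    # phi_i = (1/n) * sum_j [ K_ij * score(x_j) + K_ij * (x_i - x_j) / h^2 ]
    # (attraction toward high log-density + repulsion keeping particles spread)
    phi = (K @ score(x) + (K[:, :, None] * diff).sum(1) / h ** 2) / len(x)
    return x + eps * phi

# Target spectral density N(0, 1); only its score, -omega, is needed.
particles = 3.0 + rng.standard_normal((50, 1))
for _ in range(300):
    particles = svgd_step(particles, lambda w: -w)
```

After the loop, the particles approximate draws from the target density, and could be plugged into `rff` in place of the i.i.d. samples `W`; this score-only requirement is what lets the approach handle spectral measure posteriors without tractable samplers.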
List Of Authors: Warren, Houston and Oliveira, Rafael and Ramos, Fabio
Code Url: https://github.com/houstonwarren/m-srfr
Submission Number: 720