Keywords: Bayesian Optimization, Bounded Domain, Kernels
TL;DR: Covariance functions for Bayesian optimization are typically defined over unbounded domains; we define a natural bounded-domain kernel that performs better when optimizing model compression.
Abstract: Bayesian optimization with Gaussian processes (GPs) is commonly used to optimize black-box functions. The Matérn and the Radial Basis Function (RBF) covariance functions are used frequently, but they make no assumptions about the domain of the function, which may limit their applicability on bounded domains. To address this limitation, we introduce the Beta kernel, a non-stationary kernel induced by a product of Beta distribution density functions. This formulation allows our kernel to naturally model functions on bounded domains. We present statistical evidence supporting the hypothesis that the kernel exhibits an exponential eigendecay rate, based on empirical analyses of its spectral properties across different settings. Our experimental results demonstrate the robustness of the Beta kernel in modeling functions whose optima lie near the faces or vertices of the unit hypercube. The experiments show that our kernel consistently outperforms a wide range of kernels, including the well-known Matérn and RBF, on diverse problems, including synthetic function optimization and the compression of vision and language models.
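The abstract does not give the kernel's exact definition, but one standard way to obtain a positive-semidefinite, non-stationary kernel from density functions is to scale a base kernel by per-dimension Beta pdf factors, since any kernel of the form f(x) k(x, y) f(y) remains valid. The sketch below illustrates this construction on the unit hypercube; the function name, the choice of an RBF base kernel, and the shared shape parameters `a`, `b` are illustrative assumptions, not the paper's definition.

```python
import numpy as np
from scipy.stats import beta as beta_dist

def beta_scaled_rbf(X, Y, a=2.0, b=2.0, lengthscale=0.2):
    """Hypothetical sketch of a Beta-induced non-stationary kernel on [0,1]^d:
    k(x, y) = f(x) * k_rbf(x, y) * f(y),  f(x) = prod_d Beta.pdf(x_d; a, b).
    Scaling a valid kernel by f(x) f(y) preserves positive semidefiniteness."""
    fx = beta_dist.pdf(X, a, b).prod(axis=1)  # shape (n,): Beta pdf product per row
    fy = beta_dist.pdf(Y, a, b).prod(axis=1)  # shape (m,)
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    k_rbf = np.exp(-0.5 * sq / lengthscale**2)            # stationary RBF base kernel
    return fx[:, None] * k_rbf * fy[None, :]              # non-stationary scaled kernel

rng = np.random.default_rng(0)
X = rng.uniform(0.01, 0.99, size=(6, 2))  # points strictly inside the unit square
K = beta_scaled_rbf(X, X)
```

The resulting Gram matrix `K` is symmetric and positive semidefinite, so it can be used directly as a GP covariance; the Beta prefactors make the kernel's variance depend on position within the hypercube, unlike the stationary Matérn and RBF kernels.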
Supplementary Material: zip
Latex Source Code: zip
Code Link: https://github.com/imedslab/BetaKernel
Signed PMLR Licence Agreement: pdf
Readers: auai.org/UAI/2025/Conference, auai.org/UAI/2025/Conference/Area_Chairs, auai.org/UAI/2025/Conference/Reviewers, auai.org/UAI/2025/Conference/Submission432/Authors, auai.org/UAI/2025/Conference/Submission432/Reproducibility_Reviewers
Submission Number: 432