BARK: A Fully Bayesian Tree Kernel for Black-box Optimization

Published: 01 May 2025, Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: A fully Bayesian tree kernel provides a richer posterior, providing better probabilistic models for mixed-feature Bayesian optimization.
Abstract: We perform Bayesian optimization using a Gaussian process perspective on Bayesian Additive Regression Trees (BART). Our BART Kernel (BARK) uses tree agreement to define a posterior over piecewise-constant functions, and we explore the space of tree kernels using a Markov chain Monte Carlo approach. Where BART only samples functions, the resulting BARK model obtains samples of Gaussian processes defining distributions over functions, which allow us to build acquisition functions for Bayesian optimization. Our tree-based approach enables global optimization over the surrogate, even for mixed-feature spaces. Moreover, where many previous tree-based kernels provide uncertainty quantification over function values, our sampling scheme captures uncertainty over the tree structure itself. Our experiments show the strong performance of BARK on both synthetic and applied benchmarks, due to the combination of our fully Bayesian surrogate and the optimization procedure.
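The abstract describes a kernel built from tree agreement: two points covary to the extent that the sampled trees route them to the same leaf. A minimal sketch of that idea (not the authors' implementation; the `trees` representation and `leaf_indices` helper here are hypothetical stand-ins for one MCMC sample of the tree ensemble):

```python
import numpy as np

def leaf_indices(trees, X):
    # For each tree (a callable mapping a point to a leaf id),
    # record which leaf each point falls into.
    return np.array([[tree(x) for x in X] for tree in trees])

def tree_agreement_kernel(trees, X1, X2):
    # k(x, x') = fraction of trees in which x and x' share a leaf.
    # This is positive semi-definite and piecewise constant, matching
    # the posterior over piecewise-constant functions described above.
    L1 = leaf_indices(trees, X1)  # shape (n_trees, len(X1))
    L2 = leaf_indices(trees, X2)  # shape (n_trees, len(X2))
    agree = L1[:, :, None] == L2[:, None, :]
    return agree.mean(axis=0)

# Toy example: three stumps splitting a 1-D input at fixed thresholds.
trees = [lambda x, t=t: int(x > t) for t in (0.3, 0.5, 0.7)]
X = [0.2, 0.4, 0.9]
K = tree_agreement_kernel(trees, X, X)  # diagonal is 1: every point shares all leaves with itself
```

In the full method, each MCMC sample of the tree structure yields one such kernel, and hence one GP posterior; averaging acquisition values across samples captures uncertainty over the tree structure itself.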
Lay Summary: Bayesian optimization (BO) is a powerful tool for optimizing unknown functions, such as maximizing the yield of a product in a chemistry experiment. Many real-world problems have a mixed structure: for example, a chemical reaction may have temperature as a continuous input and choice of catalyst as a categorical input. Modeling and optimizing over these mixed spaces is an emerging field in BO. We propose a new model, BARK, that combines the modeling power of tree-based methods with the strong uncertainty quantification of Gaussian processes (GPs), which are the typical model of choice in BO. By taking inspiration from Bayesian tree approaches, we improve over past work by capturing the uncertainty in the tree structure itself, which allows us to better understand which new potential experiments are worth exploring. Moreover, we can formulate the optimization as a 'mixed-integer program', which allows the optimization to be solved globally, without needing gradients. We show competitive performance on a selection of synthetic and applied BO problems, outperforming current state-of-the-art tree-based methods.
Link To Code: https://github.com/TobyBoyne/bark
Primary Area: Optimization->Zero-order and Black-box Optimization
Keywords: Bayesian optimization, tree kernels, Bayesian additive regression trees, Gaussian processes
Submission Number: 6969