Keywords: Bayesian optimization, Tabular Foundation Models
Abstract: Existing high-dimensional Bayesian optimization (BO) methods typically leverage low-dimensional embeddings or structural assumptions to mitigate the curse of dimensionality, yet these approaches frequently incur considerable computational overhead due to iterative surrogate retraining and fixed assumptions. To address these limitations, we propose Gradient-Informed Bayesian Optimization using Tabular Foundation Models (GIT-BO), an approach that uses a pre-trained tabular foundation model (TFM) as a surrogate, leveraging its gradient information to adaptively identify low-dimensional subspaces for optimization. Our method exploits internal gradient computations from the TFM's forward pass to construct a gradient-informed diagnostic matrix that reveals the most sensitive directions of the TFM's predictions, enabling optimization in a continuously re-estimated active subspace without the need for repeated model retraining. Extensive empirical evaluation across 23 synthetic and real-world benchmarks demonstrates that GIT-BO consistently outperforms four state-of-the-art Gaussian process-based high-dimensional BO methods, showing superior scalability and optimization performance, especially as dimensionality increases up to 500 dimensions.
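To illustrate the idea of a gradient-informed active subspace described in the abstract, the following is a minimal sketch, not the authors' implementation: it assumes a hypothetical callable `surrogate_grad` that returns the TFM's prediction gradients with respect to the inputs, builds the diagnostic matrix from those gradients, and takes its leading eigenvectors as the most sensitive directions.

```python
import numpy as np

def estimate_active_subspace(surrogate_grad, X, k):
    """Estimate a k-dimensional active subspace from surrogate gradients.

    surrogate_grad : callable mapping an (n, d) array of inputs to an
                     (n, d) array of prediction gradients (assumed interface).
    X              : (n, d) array of evaluated or candidate points.
    k              : target subspace dimension.
    """
    G = surrogate_grad(X)                 # (n, d) gradients from the surrogate
    C = G.T @ G / G.shape[0]              # gradient-informed diagnostic matrix
    eigvals, eigvecs = np.linalg.eigh(C)  # symmetric eigendecomposition
    order = np.argsort(eigvals)[::-1]     # rank directions by sensitivity
    W = eigvecs[:, order[:k]]             # top-k most sensitive directions
    return W                              # optimize within span(W)
```

In this sketch, re-estimating `W` at each BO iteration (rather than fixing an embedding up front) mirrors the continuously re-estimated subspace described above, with no retraining of the surrogate required.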
Submission Number: 69