Sample-Efficient Bayesian Optimization with Transfer Learning for Heterogeneous Search Spaces

Published: 12 Jul 2024, Last Modified: 13 Aug 2024 · AutoML 2024 Workshop · CC BY 4.0
Keywords: Bayesian optimization
Abstract: Bayesian optimization (BO) is a powerful approach to sample-efficient optimization of black-box functions. However, in settings with very few function evaluations, a successful application of BO may require transferring information from historical experiments. These related experiments may not have exactly the same tunable parameters (search spaces), motivating the need for BO with transfer learning for heterogeneous search spaces. In this paper, we propose two methods for this setting. The first approach leverages a Gaussian process (GP) model with a conditional kernel to transfer information between different search spaces. Our second approach treats the missing parameters as hyperparameters of the GP model that can be inferred jointly with the other GP hyperparameters or set to fixed values. We show that these two methods perform well on several benchmark problems.
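The second approach described above (treating parameters that are missing from a historical task's search space as quantities the GP can fill in) can be illustrated with a minimal sketch. The kernel, fill values, and two-task setup below are illustrative assumptions, not the paper's actual implementation: missing dimensions are imputed with per-dimension fill values that could either be fixed or optimized jointly with the GP's other hyperparameters via the marginal likelihood.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def impute_missing(X, fill_values):
    # Replace NaN entries (parameters absent from a task's search space)
    # with per-dimension fill values. In the approach sketched here these
    # fill values act like extra GP hyperparameters: they can be fixed or
    # tuned alongside lengthscales.
    X = X.copy()
    nan_mask = np.isnan(X)
    X[nan_mask] = np.broadcast_to(fill_values, X.shape)[nan_mask]
    return X

# Hypothetical setup: a historical task tuned only parameter 0,
# while the current task tunes both parameters.
X_hist = np.array([[0.2, np.nan],
                   [0.8, np.nan]])
X_curr = np.array([[0.5, 0.3],
                   [0.1, 0.9]])

fill = np.array([0.0, 0.5])  # assumed fill values for missing dimensions
X_all = impute_missing(np.vstack([X_hist, X_curr]), fill)
K = rbf_kernel(X_all, X_all)  # joint kernel over both tasks' points
```

With the missing entries imputed, all historical and current observations live in one search space, so a single GP kernel matrix `K` covers both tasks and historical data can inform the current optimization.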
Submission Checklist: Yes
Broader Impact Statement: Yes
Paper Availability And License: Yes
Code Of Conduct: Yes
Optional Meta-Data For Green-AutoML: All questions below on environmental impact are optional.
Steps For Environmental Footprint Reduction During Development: We tried our best to minimize the amount of compute used while still obtaining statistically significant results.
CPU Hours: 156
GPU Hours: 0
TPU Hours: 0
Evaluation Metrics: No
Submission Number: 7