Multi-fidelity Bayesian Optimization with Multiple Information Sources of Input-dependent Fidelity

Published: 26 Apr 2024, Last Modified: 15 Jul 2024. UAI 2024 poster. License: CC BY 4.0
Keywords: Bayesian Optimization, Multi-Fidelity BO
TL;DR: We investigate MFBO when multi-fidelity approximations have input-dependent fidelity, design a new MFBO method that captures this input dependency for multi-fidelity queries, and demonstrate its superiority both theoretically and experimentally.
Abstract: By querying approximate surrogate models of different fidelity as available information sources, Multi-Fidelity Bayesian Optimization (MFBO) aims at optimizing unknown functions that are costly, if not infeasible, to evaluate. Existing MFBO methods often assume that approximate surrogates have consistently high/low fidelity across the input domain. However, approximate evaluations from the same surrogate can have different fidelity at different input regions due to data availability and model constraints, especially when considering machine learning surrogates. In this work, we investigate MFBO when multi-fidelity approximations have input-dependent fidelity. By explicitly capturing input dependency for multi-fidelity queries in a Gaussian Process (GP), our new input-dependent MFBO (iMFBO) with learnable noise models better captures the fidelity of each information source in an intuitive way. We further design a new acquisition function for iMFBO and prove that the queries selected by iMFBO have higher quality than those selected by naive MFBO methods, with a derived sub-linear regret bound. Experiments on both synthetic and real-world data demonstrate its superior empirical performance.
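The abstract's core idea of modeling input-dependent fidelity as input-dependent observation noise in a GP can be illustrated with a minimal sketch. This is not the authors' iMFBO implementation; it is a generic heteroscedastic GP posterior in NumPy, where each observation carries its own noise variance (here hand-assigned per input region, standing in for a learned noise model), so a source is trusted more where its fidelity is high:

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=0.5, variance=1.0):
    """Squared-exponential kernel for 1-D inputs."""
    d = X1[:, None] - X2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X_train, y_train, noise_var, X_test):
    """GP posterior mean/variance with per-observation (heteroscedastic) noise.

    noise_var: one noise variance per training point, standing in for an
    input-dependent fidelity model of the queried information source.
    """
    K = rbf_kernel(X_train, X_train) + np.diag(noise_var)
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v ** 2, axis=0)
    return mean, var

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.linspace(0, 1, 20)
    # Hypothetical source: near-exact on [0, 0.5], noisy on (0.5, 1].
    noise_var = np.where(X <= 0.5, 1e-4, 0.25)
    y = np.sin(2 * np.pi * X) + rng.normal(0.0, np.sqrt(noise_var))
    mean, var = gp_posterior(X, y, noise_var, np.array([0.25, 0.75]))
    # Posterior uncertainty is larger where the source's fidelity is lower.
    print(var[0] < var[1])
```

An acquisition function built on this posterior would then naturally prefer querying a cheap source only in regions where its input-dependent noise is small.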
Supplementary Material: zip
List Of Authors: Fan, Mingzhou and Yoon, Byung-Jun and Dougherty, Edward and Urban, Nathan and Alexander, Francis and Arr\'oyave, Raymundo and Qian, Xiaoning
Latex Source Code: zip
Signed License Agreement: pdf
Submission Number: 296