Keywords: bilevel, Bayesian optimization
Abstract: Bilevel optimization, characterized by a two-level hierarchical optimization structure, is prevalent in real-world problems but poses significant challenges, especially in noisy, constrained, and derivative-free settings. To tackle these challenges, we present a novel algorithm for BILevel Bayesian Optimization (BILBO) that optimizes both upper- and lower-level problems jointly in a sample-efficient manner by using confidence bounds to construct trusted sets of feasible and lower-level optimal solutions. We show that sampling from our trusted sets guarantees points with instantaneous regret bounds. Moreover, BILBO selects only one function query per iteration, facilitating its use in decoupled settings where upper- and lower-level function evaluations may come from different simulators or experiments. We also show that this function query selection strategy leads to an instantaneous regret bound for the query point. The performance of BILBO is theoretically guaranteed with a sublinear regret bound and is empirically evaluated on several synthetic and real-world problems.
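Below is a minimal, self-contained sketch of the confidence-bound idea summarized in the abstract, not the authors' actual BILBO implementation: Gaussian-process surrogates provide lower/upper confidence bounds, which filter a candidate grid down to a "trusted" set of plausibly lower-level-optimal points, from which a single query is selected. The toy objectives F and f, the constant beta, and the uncertainty-based query rule are illustrative assumptions rather than the paper's selection criterion.

```python
# Illustrative sketch (not the paper's algorithm): confidence-bound "trusted set"
# filtering for a toy bilevel problem, assuming GP surrogates for the upper-level
# objective F(x, y) and the lower-level objective f(x, y).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Toy black-box objectives (stand-ins for expensive simulators/experiments).
F = lambda x, y: (x - 0.5) ** 2 + (y - x) ** 2   # upper-level objective
f = lambda x, y: (y - np.sin(3 * x)) ** 2        # lower-level objective

# A few noisy joint observations of each level, fitted with GP surrogates.
XY = rng.uniform(0, 1, size=(15, 2))
gp_F = GaussianProcessRegressor(RBF(0.2), alpha=1e-3).fit(XY, F(XY[:, 0], XY[:, 1]))
gp_f = GaussianProcessRegressor(RBF(0.2), alpha=1e-3).fit(XY, f(XY[:, 0], XY[:, 1]))

# Candidate grid over (x, y).
xs, ys = np.meshgrid(np.linspace(0, 1, 40), np.linspace(0, 1, 40))
cand = np.column_stack([xs.ravel(), ys.ravel()])

beta = 2.0  # confidence-bound width (assumed constant here for simplicity)
mu_f, sd_f = gp_f.predict(cand, return_std=True)
lcb_f, ucb_f = mu_f - beta * sd_f, mu_f + beta * sd_f

# Trusted set of plausibly lower-level-optimal points: for each upper-level x,
# keep (x, y) whose LCB does not exceed the smallest UCB among candidates with that x.
trusted = np.zeros(len(cand), dtype=bool)
for x_val in np.unique(cand[:, 0]):
    idx = np.where(cand[:, 0] == x_val)[0]
    trusted[idx] = lcb_f[idx] <= ucb_f[idx].min()

# Single-query selection (illustrative rule only): among trusted points, query where
# the surrogates are most uncertain, mimicking one function evaluation per iteration.
mu_F, sd_F = gp_F.predict(cand, return_std=True)
uncertainty = np.maximum(sd_F, sd_f)
next_query = cand[trusted][np.argmax(uncertainty[trusted])]
print("trusted candidates:", trusted.sum(), "next query (x, y):", next_query)
```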
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 10780