Neural Nonmyopic Bayesian Optimization in Dynamic Cost Settings

Published: 06 Mar 2025 (Last Modified: 24 Apr 2025) · FPI-ICLR2025 Poster · CC BY 4.0
Keywords: nonmyopic Bayesian optimization, dynamic cost settings, neural network policies, Gaussian processes, large language models
TL;DR: We perform nonmyopic Bayesian optimization in dynamic cost settings via a neural network policy, validating it on numerous synthetic and real-world settings.
Abstract: Bayesian optimization (BO) is a popular framework for optimizing black-box functions, leveraging probabilistic models such as Gaussian processes. Conventional BO algorithms, however, assume static query costs, which limits their applicability to real-world problems with dynamic cost structures, such as geological surveys or biological sequence design, where query costs vary based on previous actions. To address this, we propose LookaHES, a novel nonmyopic BO algorithm featuring dynamic cost models. LookaHES employs a neural network policy for variational optimization over multi-step lookahead horizons, enabling planning in dynamic cost environments. Empirically, we benchmark LookaHES on synthetic functions exhibiting varied dynamic cost structures. We then apply LookaHES to a real-world protein sequence design task using a large language model policy, demonstrating its scalability and effectiveness at multi-step planning in a large and complex query space. LookaHES consistently outperforms its myopic counterparts in both synthetic and real-world settings, significantly improving efficiency and solution quality. Our implementation is available at https://github.com/sangttruong/nonmyopia.
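To make the setting concrete, the following is a minimal, hedged sketch of cost-aware nonmyopic BO on a 1D toy problem. It is not the paper's LookaHES implementation: the neural network policy and variational optimization are replaced here by random policy search over query sequences, the dynamic cost is assumed to be a simple movement cost (querying far from the previous point is expensive), and the GP, kernel, horizon, and objective `f` are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Toy stand-in for the black-box objective
    return np.sin(3 * x) + 0.5 * np.cos(7 * x)

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel between 1D point sets
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_posterior(X, y, Xs, noise=1e-4):
    # Standard GP posterior mean/covariance at test points Xs
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks, Kss = rbf(X, Xs), rbf(Xs, Xs)
    Kinv = np.linalg.inv(K)
    return Ks.T @ Kinv @ y, Kss - Ks.T @ Kinv @ Ks

def move_cost(x_prev, x_next, rate=0.2):
    # Dynamic cost: the price of a query depends on the previous action
    return rate * abs(x_next - x_prev)

def rollout_value(X, y, x_prev, seq, n_samples=32):
    # Cost-adjusted H-step value of a query sequence under GP fantasies
    mu, cov = gp_posterior(X, y, np.asarray(seq))
    L = np.linalg.cholesky(cov + 1e-6 * np.eye(len(seq)))
    vals = mu[None, :] + rng.standard_normal((n_samples, len(seq))) @ L.T
    best = vals.max(axis=1).mean()  # expected best fantasized value
    cost = sum(move_cost(a, b) for a, b in zip([x_prev] + seq[:-1], seq))
    return best - cost

def plan(X, y, x_prev, horizon=3, n_restarts=256):
    # Random search over sequences stands in for the neural policy
    # optimized variationally in the paper
    best_seq, best_v = None, -np.inf
    for _ in range(n_restarts):
        seq = list(rng.uniform(-1, 1, size=horizon))
        v = rollout_value(X, y, x_prev, seq)
        if v > best_v:
            best_seq, best_v = seq, v
    return best_seq[0]  # execute only the first step, then replan

# BO loop: nonmyopic planning, myopic execution (receding horizon)
X = np.array([0.0]); y = f(X)
for _ in range(5):
    x_next = plan(X, y, X[-1])
    X = np.append(X, x_next); y = np.append(y, f(x_next))
print("best observed value:", y.max())
```

Note the receding-horizon structure: the planner scores entire length-H query sequences (balancing expected improvement against cumulative movement cost) but commits only to the first query before replanning, which is the standard way multi-step lookahead is deployed in BO.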
Submission Number: 92