Generative Pretraining for Black-Box Optimization

05 Oct 2022 (modified: 29 Sept 2024) · FMDM@NeurIPS2022
Keywords: decision making, generative modelling, transformers
Abstract: Many problems in science and engineering involve optimizing an expensive black-box function over a high-dimensional space. For such black-box optimization (BBO) problems, we typically assume a small budget for online function evaluations but often have access to a fixed, offline dataset for pretraining. Prior approaches use the offline data to approximate the function or its inverse, but these approximations are not sufficiently accurate far from the data distribution. We propose BONET, a generative framework for pretraining a novel black-box optimizer using offline datasets. In BONET, we train an autoregressive model on fixed-length trajectories derived from an offline dataset. We design a sampling strategy to synthesize trajectories from offline data using a simple heuristic of rolling out monotonic transitions from low-fidelity to high-fidelity samples. Empirically, we instantiate BONET using a causally masked transformer and evaluate it on Design-Bench, where it achieves the best average rank, outperforming state-of-the-art baselines.
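
The sketch below illustrates the trajectory-construction heuristic described in the abstract: ordering offline samples from low to high function value to form fixed-length, monotonically improving trajectories for autoregressive pretraining. It is a minimal illustration, not the authors' released code; names such as `make_trajectories`, `traj_len`, and `num_trajs` are illustrative assumptions.

```python
import numpy as np

def make_trajectories(xs, ys, traj_len=32, num_trajs=100, seed=0):
    """Sample trajectories whose function values increase monotonically.

    xs: (N, D) array of offline designs.
    ys: (N,) array of their black-box function values.
    Returns arrays of shape (num_trajs, traj_len, D) and (num_trajs, traj_len).
    """
    rng = np.random.default_rng(seed)
    n = len(ys)
    traj_x, traj_y = [], []
    for _ in range(num_trajs):
        # Draw a random subset of offline points, then sort them by function
        # value so each trajectory rolls out from low- to high-fidelity samples.
        idx = rng.choice(n, size=traj_len, replace=traj_len > n)
        idx = idx[np.argsort(ys[idx])]
        traj_x.append(xs[idx])
        traj_y.append(ys[idx])
    return np.stack(traj_x), np.stack(traj_y)

if __name__ == "__main__":
    # Toy offline dataset: 500 designs in 8 dimensions with a synthetic objective.
    xs = np.random.randn(500, 8)
    ys = -np.sum(xs ** 2, axis=1)  # higher is better
    tx, ty = make_trajectories(xs, ys, traj_len=32, num_trajs=10)
    print(tx.shape, ty.shape)      # (10, 32, 8) (10, 32)
```

The resulting (x, y) sequences can then be tokenized and fed to an autoregressive, causally masked transformer, which at test time is rolled out past the end of a trajectory to propose new, higher-value designs.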
Community Implementations: 6 code implementations (https://www.catalyzex.com/paper/generative-pretraining-for-black-box/code)