Keywords: operator learning, Bayesian optimization, scientific optimization, physics-informed operators, physics-informed neural networks, partial differential equations, deep ensemble, Gaussian process, scientific machine learning
TL;DR: Physics-informed Bayesian optimization: we propose pre-training physics-informed neural operator priors for composite Bayesian optimization, with applications to PDE-constrained scientific optimization and engineering problems
Abstract: Optimization problems in science and engineering often entail conducting expensive experiments and/or large-scale parametric partial differential equation simulations. Hence, methods that optimally trade off between the costs of ground-truth sampling and computation are invaluable. To this end, we propose a novel framework fusing pre-trained physics-informed operator priors with Bayesian optimization (BO). We experimentally demonstrate that our methods complement BO and improve its sample efficiency and optimization performance across scientific optimization scenarios, including single- and composite-objective problems with in-distribution and out-of-distribution optima. We observe over an order-of-magnitude improvement in mean squared error over baselines on large-scale linear and nonlinear test problems. Furthermore, we show that scaling up physics-informed pre-training yields continued improvements in sample efficiency, allowing significant freedom in trading off cost sources.
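One way the abstract's core idea can be sketched: use a pre-trained model as the surrogate's prior mean and fit a Gaussian process only to the residuals, so BO starts from the physics-informed prediction rather than from zero. The sketch below is a minimal, hypothetical illustration of this pattern in plain NumPy; the toy `objective`, the `prior_mean` stand-in for a pre-trained neural operator, and all hyperparameters are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

# Hypothetical stand-in for a pre-trained physics-informed operator:
# an approximate, cheap model of the expensive ground-truth objective.
def prior_mean(x):
    return np.sin(3.0 * x)

# Toy "expensive" ground-truth objective (assumed for illustration).
def objective(x):
    return np.sin(3.0 * x) + 0.3 * np.cos(7.0 * x)

def rbf(a, b, ls=0.2):
    # Squared-exponential kernel between 1-D point sets a and b.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_residual(X, r, Xs, noise=1e-6):
    # GP regression on the residual r = f(X) - prior_mean(X),
    # returning posterior mean and std at test points Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(Xs, X)
    mu = Ks @ np.linalg.solve(K, r)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mu, np.sqrt(np.maximum(var, 0.0))

def bo_with_prior(n_iters=10, seed=0):
    rng = np.random.default_rng(seed)
    grid = np.linspace(0.0, 1.0, 200)
    X = rng.uniform(0.0, 1.0, size=2)          # small initial design
    y = objective(X)
    for _ in range(n_iters):
        mu_r, sd = gp_residual(X, y - prior_mean(X), grid)
        mu = prior_mean(grid) + mu_r           # prior mean + GP correction
        ucb = mu + 2.0 * sd                    # upper-confidence-bound acquisition
        x_next = grid[np.argmax(ucb)]
        X = np.append(X, x_next)
        y = np.append(y, objective(x_next))    # "expensive" evaluation
    best = np.argmax(y)
    return X[best], y[best]

x_best, y_best = bo_with_prior()
```

Because the GP only has to learn the discrepancy between the prior and the true objective, a good pre-trained prior lets the loop concentrate its few ground-truth evaluations near promising regions, which is the sample-efficiency trade-off the abstract describes.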
Email Sharing: We authorize the sharing of all author emails with Program Chairs.
Data Release: We authorize the release of our submission and author names to the public in the event of acceptance.
Submission Number: 19