Thompson Sampling in Function Spaces via Neural Operators

Published: 09 Jun 2025, Last Modified: 13 Jul 2025
Venue: ICML 2025 Workshop SIM (Poster)
License: CC BY 4.0
Keywords: neural operators, Thompson sampling, bandits, Bayesian optimization
TL;DR: We propose Neural Operator Thompson Sampling (NOTS), a method that efficiently optimizes functionals of unknown operators in function spaces by using neural operators as surrogate models within a Thompson sampling framework.
Abstract: We propose an extension of Thompson sampling to optimization problems over function spaces where the objective is a known functional of an unknown operator's output. We assume that functional evaluations are inexpensive, while queries to the operator (such as running a high-fidelity simulator) are costly. Our algorithm employs a sample-then-optimize approach using neural operator surrogates. This strategy avoids explicit uncertainty quantification by treating trained neural operators as approximate samples from a Gaussian process. We provide novel theoretical convergence guarantees based on Gaussian processes in the infinite-dimensional setting, under minimal assumptions. We benchmark our method against existing baselines on functional optimization tasks involving partial differential equations and other nonlinear operator-driven phenomena, demonstrating improved sample efficiency and competitive performance.
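The sample-then-optimize loop described in the abstract can be illustrated with a toy sketch. Everything below is a hypothetical stand-in, not the paper's implementation: `expensive_operator` plays the role of a costly simulator, `functional` is the known, cheap functional of its output, and a randomized ridge fit on perturbed targets stands in for retraining a neural operator (which the paper treats as an approximate Gaussian-process sample). Each round fits one randomized surrogate, optimizes it over a candidate set, and queries the true operator at the maximizer.

```python
import numpy as np

def expensive_operator(coeffs, grid):
    # Stand-in for a costly simulator: maps input coefficients to an
    # output function evaluated on a grid. (Illustrative only.)
    return np.tanh(coeffs[0] * np.sin(np.pi * grid) + coeffs[1] * grid**2)

def functional(u):
    # Known, inexpensive functional of the operator's output.
    return u.mean()

def nots_sketch(candidates, grid, n_rounds=10, seed=0):
    """Toy sample-then-optimize Thompson-sampling loop.

    A perturbed ridge regression stands in for a retrained neural
    operator surrogate; this is a sketch of the idea, not the
    authors' method.
    """
    rng = np.random.default_rng(seed)
    X, Y = [], []  # queried inputs and observed functional values
    x0 = candidates[rng.integers(len(candidates))]
    X.append(x0)
    Y.append(functional(expensive_operator(x0, grid)))
    for _ in range(n_rounds):
        A, y = np.asarray(X), np.asarray(Y)
        # Randomized surrogate: fit on noise-perturbed targets so that
        # each round draws a different plausible model (approximate
        # posterior sampling).
        y_pert = y + 0.05 * rng.standard_normal(len(y))
        w = np.linalg.solve(A.T @ A + 1e-3 * np.eye(A.shape[1]), A.T @ y_pert)
        # Optimizing the sampled surrogate is cheap: score all candidates.
        x_next = candidates[int(np.argmax(candidates @ w))]
        X.append(x_next)
        Y.append(functional(expensive_operator(x_next, grid)))
    best = int(np.argmax(Y))
    return X[best], Y[best]
```

The key property this sketch shares with the abstract's description is that uncertainty is never quantified explicitly: randomness enters only through the perturbed surrogate fit, and each round optimizes one sampled model against the cheap functional.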
Submission Number: 33