High Dimensional Bayesian Optimization with Reinforced Transformer Deep Kernels

22 Sept 2022 (modified: 13 Feb 2023) · ICLR 2023 Conference Withdrawn Submission · Readers: Everyone
Keywords: Bayesian Optimization, Reinforcement Learning, Deep Kernel Learning
TL;DR: Transformer deep kernels, combined with general-combination Gaussian process kernels, help optimize high-dimensional functions when reinforcement-learning acquisitions drive exploration.
Abstract: Bayesian Optimization (BO) has proved to be an invaluable technique for efficient high-dimensional optimization. The use of Gaussian process (GP) surrogates and dynamic acquisition functions has allowed BO to shine on challenging high-dimensional problems thanks to its sample efficiency and uncertainty modeling. Reinforcement learning has been introduced to improve optimization performance on both single-function optimization and few-shot multi-objective optimization. However, until now, even few-shot techniques have treated each objective as an independent optimization task, failing to exploit the similarities shared between objectives. We combine recent developments in Deep Kernel Learning (DKL) and attention-based Transformer models to improve the modeling power of GP surrogates with meta-learning. We propose a method for improving meta-learned BO surrogates by incorporating attention mechanisms into DKL, empowering the surrogates to adapt to contextual information gathered during the BO process. This Transformer deep kernel is combined with reinforcement-learning techniques to aid exploration, achieving state-of-the-art results on a variety of high-dimensional optimization problems.
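The abstract's core idea, feeding attention-derived, context-dependent embeddings into a GP kernel, can be sketched in a few lines of NumPy. Everything below (the weight shapes, the single attention head, the RBF base kernel, the toy data) is illustrative and does not come from the paper; the actual method would meta-learn a full Transformer feature extractor.

```python
import numpy as np

rng = np.random.default_rng(0)
d, d_emb = 6, 4           # input dim, embedding dim (illustrative sizes)
n_train, n_test = 20, 5

# Hypothetical attention weights; the paper would meta-learn these.
Wq, Wk, Wv = (rng.normal(scale=0.3, size=(d, d_emb)) for _ in range(3))

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def attend(X):
    """Single-head self-attention: each point's embedding depends on the
    whole evaluation set, giving the kernel contextual features."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    return softmax(Q @ K.T / np.sqrt(d_emb)) @ V

def rbf(A, B, ls=1.0):
    # Base RBF kernel applied to embeddings rather than raw inputs.
    d2 = ((A[:, None] - B[None, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

# Toy objective: observed points and a few candidates to score.
X_tr = rng.uniform(-1, 1, (n_train, d))
y_tr = np.sin(X_tr.sum(1))
X_te = rng.uniform(-1, 1, (n_test, d))

# Embed train and test jointly so test points share the same context.
Z = attend(np.vstack([X_tr, X_te]))
Z_tr, Z_te = Z[:n_train], Z[n_train:]

# Standard exact-GP posterior mean/variance on the learned features.
noise = 1e-4
K = rbf(Z_tr, Z_tr) + noise * np.eye(n_train)
K_s = rbf(Z_te, Z_tr)
alpha = np.linalg.solve(K, y_tr)
mu = K_s @ alpha
var = 1.0 - np.einsum("ij,ji->i", K_s, np.linalg.solve(K, K_s.T))
```

The posterior `mu` and `var` would then feed an acquisition function; in the paper's setting an RL policy, rather than a fixed rule such as expected improvement, decides how to trade the two off when picking the next query point.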
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Optimization (eg, convex and non-convex optimization)