ExPT: Synthetic Pretraining for Few-Shot Experimental Design

Published: 07 Nov 2023, Last Modified: 19 Nov 2023, FMDM@NeurIPS 2023
Keywords: experimental design, black-box optimization, foundation model, transformers, synthetic pretraining
TL;DR: We propose a foundation transformer model for experimental design.
Abstract: Experimental design is a fundamental problem in many science and engineering fields. In this problem, sample efficiency is crucial due to the time, money, and safety costs of real-world design evaluations. Existing approaches either rely on active data collection or access to large, labeled datasets of past experiments, making them impractical in many real-world scenarios. In this work, we address the more challenging yet realistic setting of few-shot experimental design, where only a few labeled data points of input designs and their corresponding values are available. We approach this problem as a conditional generation task, where a model conditions on a few labeled examples and the desired output to generate an optimal input design. To this end, we present Pretrained Transformers for Experimental Design (ExPT), which uses a novel combination of synthetic pretraining and in-context learning to enable few-shot generalization. In ExPT, we assume knowledge only of a finite collection of unlabeled data points from the input domain and pretrain a transformer neural network to optimize diverse synthetic functions defined over this domain. Unsupervised pretraining allows ExPT to adapt to any design task at test time in an in-context fashion, by conditioning on a few labeled data points from the target task and generating candidate optima. We evaluate ExPT on few-shot experimental design in challenging domains and demonstrate its superior generality and performance compared to existing methods.
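
The abstract describes the setup at a high level; the sketch below makes the few-shot episode construction concrete. It is a minimal illustration, not the paper's actual recipe: the random-Fourier-feature family of synthetic functions, the helper names (`sample_synthetic_function`, `make_pretraining_episode`), and all hyperparameters are assumptions chosen for brevity.

```python
import numpy as np

def sample_synthetic_function(x_pool, n_features=64, length_scale=1.0, rng=None):
    """Draw one random function over the unlabeled input pool.

    A random Fourier feature model stands in for the paper's synthetic
    function family (an illustrative assumption, not ExPT's exact choice).
    """
    if rng is None:
        rng = np.random.default_rng()
    d = x_pool.shape[1]
    w = rng.normal(scale=1.0 / length_scale, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    theta = rng.normal(size=n_features)
    return np.cos(x_pool @ w + b) @ theta  # y-value for every x in the pool

def make_pretraining_episode(x_pool, n_context=5, rng=None):
    """Build one few-shot episode from unlabeled inputs only.

    Returns (context, y_target, x_target): the model is pretrained to
    generate x_target conditioned on the context pairs and the desired
    value y_target -- the conditional-generation objective in the abstract.
    """
    if rng is None:
        rng = np.random.default_rng()
    y_pool = sample_synthetic_function(x_pool, rng=rng)
    idx = rng.choice(len(x_pool), size=n_context + 1, replace=False)
    ctx, tgt = idx[:-1], idx[-1]
    context = list(zip(x_pool[ctx], y_pool[ctx]))  # few "labeled" examples
    return context, y_pool[tgt], x_pool[tgt]

# Demo: 1,000 unlabeled 8-dimensional designs, one pretraining episode.
pool = np.random.default_rng(0).uniform(-1.0, 1.0, size=(1000, 8))
context, y_star, x_star = make_pretraining_episode(pool)
print(len(context), y_star, x_star.shape)  # -> 5 <float> (8,)
```

At test time the same interface would apply: the context holds the few real labeled examples and the target value is set to a desired (e.g., extrapolated) output, so the design generated by the pretrained transformer serves as the candidate optimum.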
Submission Number: 71