Contrastive Embedding of Structured Space for Bayesian Optimization

30 Sept 2021, 15:22 (modified: 10 Dec 2021, 19:30) · NeurIPS 2021 Workshop MetaLearn Poster
Keywords: Contrastive learning, Bayesian Optimization, Meta Learning, Context-free Grammar
Abstract: Bayesian optimisation (BO) has been used to search in structured spaces described by a context-free grammar, such as chemical molecules. Previous work has used a probabilistic generative model, such as a variational autoencoder, to learn a mapping from the structured representations into a compact continuous embedding within which BO can take advantage of local proximity and identify good search areas. However, the resultant embedding does not fully capture the structural proximity relations of the input space, which leads to inefficient search. In this paper, we propose to use contrastive learning to learn an alternative embedding. We outline how a subtree replacement strategy can generate structurally similar pairs of samples from the input space for use in contrastive learning. We demonstrate that the resulting embedding captures more of the structural proximity relationships of the input space and improves BO performance when applied to a synthetic arithmetic expression fitting task and a real-world molecule optimisation task.
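The subtree replacement idea in the abstract can be sketched in a few lines. This is our illustrative reading, not the paper's exact algorithm: arithmetic expression trees are nested lists `[op, left, right]` (leaves are strings), and a positive pair for contrastive learning is the original tree together with a copy in which one randomly chosen subtree has been swapped for a freshly sampled one. The grammar, tree encoding, and sampling depths below are all assumptions made for the sketch.

```python
import random

# Hypothetical toy grammar for arithmetic expressions (not from the paper).
LEAVES = ["x", "1", "2", "3"]
OPS = ["+", "-", "*"]

def sample_tree(depth):
    """Sample a random expression tree of at most the given depth."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(LEAVES)
    return [random.choice(OPS), sample_tree(depth - 1), sample_tree(depth - 1)]

def paths(tree, prefix=()):
    """Enumerate index paths to every subtree (children sit at indices 1, 2)."""
    yield prefix
    if isinstance(tree, list):
        for i in (1, 2):
            yield from paths(tree[i], prefix + (i,))

def replace(tree, path, subtree):
    """Return a copy of tree with the node at `path` swapped for `subtree`."""
    if not path:
        return subtree
    new = list(tree)
    new[path[0]] = replace(tree[path[0]], path[1:], subtree)
    return new

def contrastive_pair(tree):
    """Positive pair: the tree and a single-subtree perturbation of it."""
    path = random.choice(list(paths(tree)))
    return tree, replace(tree, path, sample_tree(depth=1))

anchor = sample_tree(depth=2)
positive_a, positive_b = contrastive_pair(anchor)
```

A contrastive objective (e.g. InfoNCE) would then pull the embeddings of `positive_a` and `positive_b` together while pushing apart embeddings of independently sampled trees, so that structural proximity in the grammar is reflected as Euclidean proximity in the latent space BO searches over.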
Contribution Process Agreement: Yes
Author Revision Details: We would like to thank the workshop committee and reviewers for all of their great feedback. We have addressed most comments. This hopefully makes our work clearer and more appropriate for the meta-learning workshop.
Poster Session Selection: Poster session #2 (16:50 UTC+1)