Robust Guided Diffusion for Offline Black-box Optimization

06 May 2024 (modified: 06 Nov 2024) · Submitted to NeurIPS 2024 · CC BY 4.0
Keywords: offline model-based optimization, black-box optimization, diffusion models, score-based SDE, guided diffusion, classifier diffusion guidance, classifier-free diffusion guidance
TL;DR: We propose Robust Guided Diffusion for Offline Black-box Optimization (RGD), melding the advantages of proxy (explicit guidance) and proxy-free diffusion (robustness) for effective conditional generation.
Abstract: Offline black-box optimization aims to maximize a black-box function using an offline dataset of designs and their measured properties. Two main approaches have emerged: the forward approach, which learns a mapping from an input to its value and thereby acts as a proxy to guide optimization, and the inverse approach, which learns a mapping from value to input for conditional generation. (a) Although proxy-free (classifier-free) diffusion shows promise in robustly modeling the inverse mapping, it lacks explicit guidance from proxies, which is essential for generating high-performance samples beyond the training distribution. We therefore propose proxy-enhanced sampling, which uses the explicit guidance of a trained proxy to bolster proxy-free diffusion with enhanced sampling control. (b) Yet the trained proxy is susceptible to out-of-distribution issues. To address this, we devise a diffusion-based proxy refinement module, which seamlessly integrates insights from proxy-free diffusion back into the proxy for refinement. In summary, we propose Robust Guided Diffusion for Offline Black-box Optimization (RGD), combining proxy and proxy-free diffusion for effective conditional generation. Empirical evaluations on Design-Bench underscore the efficacy of RGD. Our code is here.
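To make the idea of proxy-enhanced sampling concrete, the following is a minimal sketch of a single reverse-diffusion step that combines classifier-free guidance with an explicit proxy-gradient term. It is not the authors' RGD implementation: the `denoiser` and `proxy` interfaces, the epsilon-space guidance rule, and the weights `w_cfg` and `w_proxy` are illustrative assumptions.

```python
# Illustrative sketch only: combines classifier-free guidance with
# proxy (classifier-style) gradient guidance in one DDPM-style step.
# All names and schedules below are assumptions, not the paper's code.
import torch


def guided_step(x_t, t, y_target, denoiser, proxy,
                alpha_t, alpha_bar_t, sigma_t,
                w_cfg=2.0, w_proxy=1.0):
    """One reverse step x_t -> x_{t-1} with combined guidance (illustrative)."""
    with torch.no_grad():
        # Classifier-free guidance: mix conditional and unconditional noise predictions.
        eps_cond = denoiser(x_t, t, y_target)   # conditioned on the target property value
        eps_uncond = denoiser(x_t, t, None)     # unconditional branch
        eps = eps_uncond + w_cfg * (eps_cond - eps_uncond)

    # Explicit proxy guidance: gradient of the proxy's predicted score w.r.t. x_t,
    # pushing samples toward designs the proxy rates highly.
    x_in = x_t.detach().requires_grad_(True)
    score = proxy(x_in, t).sum()                # hypothetical proxy on noisy inputs
    proxy_grad = torch.autograd.grad(score, x_in)[0]

    # Fold the proxy gradient into the noise prediction (epsilon-space shift,
    # in the style of classifier guidance).
    eps = eps - w_proxy * ((1.0 - alpha_bar_t) ** 0.5) * proxy_grad

    # Standard DDPM posterior mean plus noise.
    mean = (x_t - (1.0 - alpha_t) / ((1.0 - alpha_bar_t) ** 0.5) * eps) / (alpha_t ** 0.5)
    return mean + sigma_t * torch.randn_like(x_t)
```

Under these assumptions, setting `w_proxy=0` recovers plain classifier-free sampling, while increasing it strengthens the proxy's pull toward high-scoring regions, which is where proxy refinement against out-of-distribution errors becomes important.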
Primary Area: Machine learning for other sciences and fields
Submission Number: 2262