Discrete-Continuous Variational Optimization with Local Gradients

Published: 10 Oct 2024 · Last Modified: 12 Dec 2024 · NeurIPS 2024 Workshop · CC BY 4.0
Keywords: Variational methods, Black-box optimization, Multilevel optimization
TL;DR: We introduce a general method for incorporating local gradient information into Variational Optimization for black-box functions.
Abstract: Variational optimization (VO) offers a general approach for handling objectives that may involve discontinuities or whose gradients are difficult to calculate. By introducing a variational distribution over the parameter space, such objectives are smoothed and rendered amenable to VO methods. In certain problems, however, local gradient information may be available, and it is neglected by such an approach. We therefore consider a general method for incorporating local gradient information via an augmented VO objective function to accelerate convergence and improve accuracy. We show how our augmented objective can be viewed as an instance of multilevel optimization. Finally, we show our method can train a genetic algorithm simulator, using a recursive Wasserstein distance objective.
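To make the VO idea in the abstract concrete, here is a minimal sketch (not the paper's method): a discontinuous scalar objective `f` is smoothed by taking its expectation under a Gaussian variational distribution, and the distribution's mean is updated with the score-function (REINFORCE) gradient estimator, which requires no gradient of `f` itself. The objective, step sizes, and baseline choice are illustrative assumptions.

```python
import numpy as np

def f(x):
    # Discontinuous black-box objective: a unit step penalty for x < 0
    # plus a quadratic bowl centered at x = 1.
    return (x < 0).astype(float) + (x - 1.0) ** 2

rng = np.random.default_rng(0)
theta, sigma, lr = -2.0, 0.5, 0.05  # variational mean, fixed std, step size
for _ in range(500):
    x = rng.normal(theta, sigma, size=64)          # samples from q_theta
    fx = f(x)
    grad_logq = (x - theta) / sigma**2             # d/dtheta log N(x; theta, sigma^2)
    baseline = fx.mean()                           # simple variance-reduction baseline
    grad = np.mean((fx - baseline) * grad_logq)    # score-function estimator
    theta -= lr * grad

# theta approaches the smoothed minimum, near x = 1
print(theta)
```

Because the gradient is estimated purely from function evaluations, the same loop applies to any black-box `f`; the paper's contribution is to augment this objective with local gradient information when it is available.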
Submission Number: 69