Stein Variational Evolution Strategies

Published: 07 May 2025, Last Modified: 13 Jun 2025 · UAI 2025 Oral · CC BY 4.0
Keywords: Zero-Order Methods, Approximate Inference, Optimization, Stein Variational Gradient Descent
TL;DR: We combine SVGD with evolution strategies to improve gradient-free inference and optimization.
Abstract: Efficient global optimization and sampling are fundamental challenges, particularly in fields such as robotics and reinforcement learning, where gradients may be unavailable or unreliable. In this context, jointly optimizing multiple solutions is a promising approach to avoid local optima. While Stein Variational Gradient Descent (SVGD) provides a powerful framework for sampling diverse solutions, its reliance on first-order information limits its applicability to differentiable objectives. Existing gradient-free SVGD variants often suffer from slow convergence and poor scalability. To improve gradient-free sampling and optimization, we propose Stein Variational CMA-ES (SV-CMA-ES), a novel gradient-free SVGD-like method that combines the efficiency of evolution strategies with SVGD-based repulsion forces. We perform an extensive empirical evaluation across several domains, which shows that integrating the ES update into SVGD significantly improves performance on multiple challenging benchmark problems. Our findings establish SV-CMA-ES as a scalable method for zero-order sampling and black-box optimization, bridging the gap between SVGD and evolution strategies.
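The abstract describes replacing SVGD's score term with an evolution-strategy estimate while keeping the kernel repulsion between particles. Below is a minimal sketch of that idea, assuming a plain isotropic antithetic ES in place of full CMA-ES; the function names (`rbf_kernel`, `es_grad`, `svgd_es_step`) and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Sketch: SVGD-style particle update where the (unavailable) score
# grad log p(x) is replaced by a zero-order ES estimate. Illustrative
# only; the paper's method uses CMA-ES updates, not this plain ES.
import numpy as np

def rbf_kernel(X, h):
    """RBF kernel matrix K and gradients of k(x_i, x_j) w.r.t. x_i."""
    diffs = X[:, None, :] - X[None, :, :]          # (n, n, d)
    sq = np.sum(diffs ** 2, axis=-1)               # (n, n)
    K = np.exp(-sq / (2 * h ** 2))
    gradK = -diffs / h ** 2 * K[..., None]         # (n, n, d)
    return K, gradK

def es_grad(f, x, sigma=0.1, pop=32, rng=None):
    """Antithetic ES estimate of grad f(x) using 2*pop evaluations."""
    rng = rng or np.random.default_rng()
    eps = rng.standard_normal((pop, x.size))
    fp = np.array([f(x + sigma * e) for e in eps])
    fm = np.array([f(x - sigma * e) for e in eps])
    return ((fp - fm)[:, None] * eps).mean(axis=0) / (2 * sigma)

def svgd_es_step(f, X, step=0.1, h=1.0):
    """One SVGD-like update with ES surrogate scores:
    phi(x_i) = (1/n) sum_j [k(x_j, x_i) * G_j + grad_{x_j} k(x_j, x_i)]."""
    n, _ = X.shape
    G = np.stack([es_grad(f, x) for x in X])       # surrogate scores
    K, gradK = rbf_kernel(X, h)
    phi = (K @ G + gradK.sum(axis=0)) / n          # attraction + repulsion
    return X + step * phi

if __name__ == "__main__":
    # Usage: sample a 2-D standard Gaussian from log-density values alone.
    logp = lambda x: -0.5 * np.sum(x ** 2)
    X = np.random.default_rng(0).standard_normal((20, 2)) * 3.0
    for _ in range(200):
        X = svgd_es_step(logp, X)
    print(X.mean(axis=0), X.std(axis=0))           # approx. zero mean, unit std
```

The repulsion term `gradK.sum(axis=0)` is what keeps the particles diverse; dropping it reduces the update to independent zero-order ascent on each particle.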
Supplementary Material: zip
Latex Source Code: zip
Signed PMLR Licence Agreement: pdf
Readers: auai.org/UAI/2025/Conference, auai.org/UAI/2025/Conference/Area_Chairs, auai.org/UAI/2025/Conference/Reviewers, auai.org/UAI/2025/Conference/Submission740/Authors, auai.org/UAI/2025/Conference/Submission740/Reproducibility_Reviewers
Submission Number: 740