Keywords: Quality Diversity Optimization
TL;DR: We introduce Soft QD, a new formulation of quality-diversity optimization, and derive from it SQUAD, an algorithm that achieves state-of-the-art performance with improved scalability to high-dimensional spaces.
Abstract: Quality-Diversity (QD) algorithms constitute a branch of optimization that is concerned with discovering a diverse and high-quality set of solutions to an optimization problem.
Current QD methods commonly maintain diversity by dividing the behavior space into discrete regions, ensuring that solutions are distributed across different parts of the space.
The QD problem is then solved by searching for the best solution in each region.
This approach to QD optimization poses challenges in large solution spaces, where storing many solutions is impractical, and in high-dimensional behavior spaces, where discretization becomes ineffective due to the curse of dimensionality.
We present an alternative framing of the QD problem, called \emph{Soft QD}, that sidesteps the need for discretizations.
We validate this formulation by demonstrating its desirable properties, such as monotonicity, and by relating its limiting behavior to the widely used QD Score metric.
Furthermore, we leverage it to derive a novel differentiable QD algorithm, \emph{Soft QD Using Approximated Diversity (SQUAD)}, and demonstrate empirically that it is competitive with current state-of-the-art methods on standard benchmarks while offering better scalability to higher-dimensional problems.
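For context, the sketch below illustrates the conventional discretization-based approach that the abstract contrasts with Soft QD: a MAP-Elites-style grid archive that keeps the best solution per cell, together with the QD Score metric (the sum of the elites' fitness values over filled cells). This is a minimal toy illustration, not the paper's method; the behavior range, grid resolution, and objective are assumptions for the example.

```python
import numpy as np

# Toy sketch (not the paper's SQUAD algorithm): a MAP-Elites-style grid archive,
# i.e., the discretization-based QD approach the abstract describes.
# Assumed setup: behavior descriptors lie in [0, 1]^d, higher fitness is better.

class GridArchive:
    def __init__(self, cells_per_dim, behavior_dim):
        self.cells_per_dim = cells_per_dim
        self.behavior_dim = behavior_dim
        self.fitness = {}    # cell index (tuple) -> best fitness seen so far
        self.solutions = {}  # cell index (tuple) -> corresponding solution

    def _cell(self, behavior):
        # Map a behavior descriptor to its discrete cell index.
        idx = np.clip((behavior * self.cells_per_dim).astype(int),
                      0, self.cells_per_dim - 1)
        return tuple(idx)

    def add(self, solution, behavior, fitness):
        # Keep only the best solution per region ("best solution in each region").
        cell = self._cell(np.asarray(behavior))
        if cell not in self.fitness or fitness > self.fitness[cell]:
            self.fitness[cell] = fitness
            self.solutions[cell] = solution

    def qd_score(self):
        # QD Score: sum of elite fitness values over all filled cells.
        return sum(self.fitness.values())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    archive = GridArchive(cells_per_dim=10, behavior_dim=2)
    for _ in range(1000):
        x = rng.normal(size=5)        # candidate solution (toy)
        b = rng.uniform(size=2)       # its behavior descriptor (toy)
        f = -float(np.sum(x ** 2))    # its fitness under a toy objective
        archive.add(x, b, f)
    print(f"filled cells: {len(archive.fitness)}, QD Score: {archive.qd_score():.2f}")
```

The number of cells grows exponentially with the behavior dimension, which is the curse-of-dimensionality issue the abstract points to as motivation for a discretization-free formulation.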
Supplementary Material: zip
Primary Area: optimization
Submission Number: 22139