Keywords: Multi-objective optimization, Denoising Diffusion Probabilistic Models, Multiple gradient descent, Offline multi-objective optimization, Multi-objective Bayesian optimization, Diffusion Transformer
TL;DR: We propose SPREAD, a diffusion-based generative framework for multi-objective optimization that couples adaptive gradient updates with a repulsion mechanism, achieving competitive efficiency, scalability, and Pareto front coverage across benchmarks.
Abstract: Developing efficient multi-objective optimization methods to compute the Pareto set of optimal compromises between conflicting objectives remains a key challenge, especially for large-scale and expensive problems. To address this, we introduce SPREAD, a generative framework based on Denoising Diffusion Probabilistic Models (DDPMs). SPREAD first learns a conditional diffusion process over points sampled from the decision space and then, at each reverse diffusion step, refines candidates via a sampling scheme that combines an adaptive update inspired by multiple gradient descent, for fast convergence, with a Gaussian RBF-based repulsion term, for diversity. Empirical results on multi-objective optimization benchmarks, including offline and Bayesian surrogate-based settings, show that SPREAD matches or exceeds leading baselines in efficiency, scalability, and Pareto front coverage.
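The abstract's candidate-refinement idea, a common-descent step in the style of multiple gradient descent (MGDA) combined with a Gaussian-RBF repulsion among candidates, can be sketched on a toy bi-objective problem. This is an illustrative sketch only, not the paper's implementation: the closed-form two-objective MGDA weight, the mean-normalized repulsion, and all step sizes (`step`, `rep_weight`, `bandwidth`) are assumptions for the demo.

```python
import numpy as np

def mgda_direction_2obj(g1, g2):
    """Min-norm point in the convex hull of two gradients (closed-form
    MGDA weight for the two-objective case)."""
    diff = g1 - g2
    denom = float(np.dot(diff, diff))
    alpha = 0.5 if denom < 1e-12 else np.clip(np.dot(g2 - g1, g2) / denom, 0.0, 1.0)
    return alpha * g1 + (1.0 - alpha) * g2

def rbf_repulsion(X, bandwidth=0.5):
    """Gaussian-RBF repulsion: each candidate is pushed away from its
    neighbours (weighted mean offset), encouraging spread of solutions."""
    rep = np.zeros_like(X)
    for i in range(X.shape[0]):
        d = X[i] - X                              # pairwise offsets, shape (n, dim)
        w = np.exp(-np.sum(d * d, axis=1) / (2 * bandwidth**2))
        w[i] = 0.0                                # no self-repulsion
        rep[i] = (w[:, None] * d).sum(axis=0) / (w.sum() + 1e-12)
    return rep

def refine_step(X, grads, step=0.05, rep_weight=0.5):
    """One refinement step: common-descent update plus repulsion term."""
    G = np.stack([mgda_direction_2obj(grads[0][i], grads[1][i])
                  for i in range(X.shape[0])])
    return X - step * G + step * rep_weight * rbf_repulsion(X)

# Toy problem: f1(x) = ||x - a||^2, f2(x) = ||x - b||^2;
# the Pareto set is the segment between a and b.
a, b = np.array([1.0, 0.0]), np.array([-1.0, 0.0])
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 2))
for _ in range(200):
    X = refine_step(X, (2 * (X - a), 2 * (X - b)))
```

After the loop, the candidates cluster along the segment between `a` and `b` while the repulsion term keeps them from collapsing onto a single compromise point. In SPREAD this kind of update is applied inside the reverse diffusion steps rather than as a standalone loop.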
Primary Area: generative models
Submission Number: 16892