Accelerating Diffusion-based Combinatorial Optimization Solvers by Progressive Distillation

Published: 20 Jun 2023 · Last Modified: 11 Oct 2023 · SODS 2023 Oral
Keywords: combinatorial optimization, diffusion models, progressive distillation
TL;DR: This paper uses progressive distillation to speed up the inference of diffusion-based combinatorial optimization solvers.
Abstract: Graph-based diffusion models have shown promising results in generating high-quality solutions to NP-complete (NPC) combinatorial optimization (CO) problems. However, these models are often inefficient at inference time due to the iterative nature of the denoising diffusion process. This paper proposes to use $\textit{progressive}$ distillation to speed up inference by taking fewer steps (e.g., forecasting two steps ahead within a single step) during the denoising process. Our experimental results show that the progressively distilled model can perform inference $\textbf{16}$ times faster with only $\textbf{0.019}$% degradation in performance on the TSP-50 dataset.
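As a rough illustration of the "two steps ahead within a single step" objective described in the abstract, the sketch below trains a student denoiser to match two consecutive teacher denoising steps with one evaluation. The `ToyDenoiser` module, its timestep conditioning, and all variable names are hypothetical stand-ins, not the paper's graph-based architecture or training code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyDenoiser(nn.Module):
    """Stand-in for a graph-based diffusion denoiser (illustrative only)."""
    def __init__(self, dim=16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1, 64), nn.ReLU(), nn.Linear(64, dim))

    def forward(self, x_t, t):
        # Condition on the timestep by concatenating it to the input features.
        t_feat = torch.full((x_t.shape[0], 1), float(t))
        return self.net(torch.cat([x_t, t_feat], dim=-1))

teacher = ToyDenoiser()
student = ToyDenoiser()
student.load_state_dict(teacher.state_dict())  # initialize student from teacher
opt = torch.optim.Adam(student.parameters(), lr=1e-4)

x_t = torch.randn(8, 16)  # a batch of noisy samples at denoising step t
t = 10

with torch.no_grad():
    # Two consecutive teacher steps produce the distillation target.
    x_mid = teacher(x_t, t)         # teacher step: t   -> t-1
    target = teacher(x_mid, t - 1)  # teacher step: t-1 -> t-2

# The student forecasts two steps ahead with a single evaluation,
# halving the number of denoising steps per distillation round.
pred = student(x_t, t)
loss = F.mse_loss(pred, target)
opt.zero_grad()
loss.backward()
opt.step()
```

Repeating this procedure, with each distilled student serving as the next round's teacher, is what compounds into the reported 16x inference speedup.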
Submission Number: 32