Consistency models with learned idempotent boundary conditions

Published: 17 Jun 2024, Last Modified: 12 Jul 2024, ICML 2024 Workshop GRaM, CC BY 4.0
Track: Extended abstract
Keywords: Consistency Models, Generative Models
Abstract: Consistency Models have recently emerged as an alternative to Diffusion Models, generating high-quality data at low computational cost, often with only one or two network evaluations. At their core, Consistency Models learn to approximate an ODE flow, and their architecture is constrained to respect the boundary conditions of that ODE. In this work, we propose a novel method for training Consistency Models that learns the boundary conditions instead, yielding a model that acts as the identity only on inputs lying on the support of the data and is therefore an idempotent function. We compare our method with Consistency Models on simple tabular and image benchmarks, showing competitive sample quality and confirming the potential of the proposed training technique.
Submission Number: 45
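For context on the contrast the abstract draws, the sketch below (an editor's illustration, not the authors' code) shows the standard consistency-model parameterization of Song et al. (2023), where time-dependent coefficients c_skip and c_out hard-wire the ODE boundary condition f_theta(x, eps) = x, together with a simple diagnostic for the idempotence property f(f(x)) = f(x) that a learned boundary condition would target. The values of SIGMA_DATA and EPS and the dummy backbone F are placeholder assumptions.

```python
# Illustrative sketch only (not the authors' code): the standard
# consistency-model parameterization, which enforces f_theta(x, eps) = x
# architecturally, plus a diagnostic for idempotence on a given batch.
import numpy as np

SIGMA_DATA = 0.5   # data standard deviation used by the schedule (assumed)
EPS = 0.002        # smallest noise level, i.e. the ODE boundary time (assumed)

def c_skip(t):
    # Equals 1 at t = EPS, so the input passes through unchanged there.
    return SIGMA_DATA**2 / ((t - EPS)**2 + SIGMA_DATA**2)

def c_out(t):
    # Equals 0 at t = EPS, silencing the backbone at the boundary.
    return SIGMA_DATA * (t - EPS) / np.sqrt(SIGMA_DATA**2 + t**2)

def consistency_fn(F, x, t):
    """f_theta(x, t) = c_skip(t) * x + c_out(t) * F_theta(x, t)."""
    return c_skip(t) * x + c_out(t) * F(x, t)

def idempotence_gap(f, x):
    """||f(f(x)) - f(x)||: zero exactly when f acts idempotently on x."""
    fx = f(x)
    return np.linalg.norm(f(fx) - fx)

# At the boundary time the output equals the input regardless of the backbone.
F_dummy = lambda x, t: np.random.randn(*x.shape)
x = np.random.randn(4, 2)
assert np.allclose(consistency_fn(F_dummy, x, EPS), x)
```

In this hard-coded scheme the model is the identity at t = eps for every input; the abstract's proposal instead learns the boundary behaviour so that the map reduces to the identity only on the data support, which is what the idempotence_gap diagnostic above would measure.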