The Score-Difference Flow for Implicit Generative Modeling

Published: 18 Jul 2023, Last Modified: 17 Sept 2024. Accepted by TMLR.
Abstract: Implicit generative modeling (IGM) aims to produce samples of synthetic data matching the characteristics of a target data distribution. Recent work (e.g. score-matching networks, diffusion models) has approached the IGM problem from the perspective of pushing synthetic source data toward the target distribution via dynamical perturbations or flows in the ambient space. In this direction, we present the score difference (SD) between arbitrary target and source distributions as a flow that optimally reduces the Kullback-Leibler divergence between them while also solving the Schrödinger bridge problem. We apply the SD flow to convenient proxy distributions, which are aligned if and only if the original distributions are aligned. We demonstrate the formal equivalence of this formulation to denoising diffusion models under certain conditions. We also show that the training of generative adversarial networks includes a hidden data-optimization sub-problem, which induces the SD flow under certain choices of loss function when the discriminator is optimal. As a result, the SD flow provides a theoretical link between model classes that individually address the three challenges of the "generative modeling trilemma"—high sample quality, mode coverage, and fast sampling—thereby setting the stage for a unified approach.
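To make the abstract's central idea concrete, below is a minimal sketch (not from the paper) of an Euler discretization of the score-difference flow, in which particles move along the difference between the target and source scores. The Gaussian toy setup, the step size, and the per-step re-estimation of the source score are illustrative assumptions only; in practice both scores would typically come from learned models.

```python
import numpy as np

def score_gaussian(x, mean, var):
    # Score (gradient of the log-density) of an isotropic Gaussian N(mean, var * I).
    return (mean - x) / var

def sd_flow_step(x, target_score, source_score, step_size=0.05):
    # One Euler step of the score-difference flow: particles move along
    # grad log p_target(x) - grad log p_source(x), which (per the abstract)
    # reduces the KL divergence between source and target distributions.
    return x + step_size * (target_score(x) - source_score(x))

# Toy demo: transport samples from N(0, 1) toward N(3, 0.5) in one dimension.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(1000, 1))  # source samples

for _ in range(500):
    # Here the evolving source distribution is approximated by a Gaussian
    # refit to the current particles at every step; this moment-matching
    # proxy is a simplification for the demo, not the paper's estimator.
    src_mean, src_var = x.mean(), x.var()
    x = sd_flow_step(
        x,
        target_score=lambda v: score_gaussian(v, 3.0, 0.5),
        source_score=lambda v: score_gaussian(v, src_mean, src_var),
    )

print(x.mean(), x.var())  # approaches the target moments (3.0, 0.5)
```

In this Gaussian-to-Gaussian case the fixed point of the update is exactly the target distribution, which makes it a convenient sanity check for the flow's direction and step size.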
Submission Length: Long submission (more than 12 pages of main content)
Supplementary Material: pdf
Changes Since Last Submission: This is the camera-ready revision of the manuscript, which incorporates several changes requested by the action editor along with other general and cosmetic edits. (Please see our response to the recommendation below for more details.)
Assigned Action Editor: ~Tom_Rainforth1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 1012