Diffusion model conditioning on Gaussian mixture model and negative Gaussian mixture gradient

Published: 2025 · Last Modified: 14 May 2025 · Neurocomputing 2025 · CC BY-SA 4.0
Abstract Highlights:
- A diffusion model conditioned on a Gaussian mixture model is proposed.
- Latent distributions built from features are shown to outperform those built from classes.
- A new negative Gaussian mixture gradient is integrated into the diffusion model.
- The new gradient offers benefits similar to the Wasserstein distance.
- Combining this gradient with entropy outperforms binary cross-entropy in training.
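The highlights refer to gradients of a Gaussian mixture; the paper's specific "negative Gaussian mixture gradient" is not defined on this page, but the standard building block is the score of a Gaussian mixture, i.e. the gradient of its log-density. A minimal sketch of that quantity for an isotropic mixture (function name and NumPy implementation are illustrative assumptions, not the authors' code):

```python
import numpy as np

def gmm_score(x, means, sigmas, weights):
    """Score (gradient of log p(x)) of an isotropic Gaussian mixture.

    p(x) = sum_k w_k N(x; mu_k, sigma_k^2 I)
    grad log p(x) = sum_k r_k(x) * (mu_k - x) / sigma_k^2,
    where r_k(x) are the posterior responsibilities.
    """
    x = np.asarray(x, dtype=float)
    d = x.shape[-1]
    diffs = means - x                                  # (K, d)
    sq = np.sum(diffs ** 2, axis=-1)                   # (K,)
    # Log of each weighted component density, kept in log-space for stability.
    log_comp = (np.log(weights)
                - 0.5 * d * np.log(2 * np.pi * sigmas ** 2)
                - 0.5 * sq / sigmas ** 2)
    # Responsibilities via a softmax over components.
    r = np.exp(log_comp - log_comp.max())
    r /= r.sum()
    # Responsibility-weighted pull toward each component mean.
    return np.sum(r[:, None] * diffs / sigmas[:, None] ** 2, axis=0)
```

For example, at the midpoint of a symmetric two-component mixture the score vanishes, and for a single component it reduces to the familiar `(mu - x) / sigma**2`.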