Keywords: Grouped Dirichlet Diffusion; Probability Simplex; Hierarchical Structured Generative Modeling
Abstract: We present Grouped Dirichlet Diffusion (GDD), a novel generative model that employs the Grouped Dirichlet distribution to enable hierarchical, structured diffusion processes for high-dimensional bounded probability vectors, such as multichannel images. Unlike conventional diffusion methods that rely on Gaussian noise, GDD partitions data into meaningful feature groups (e.g., color channels in images) to preserve intra-group dependencies while allowing adaptive inter-group interactions across diffusion timesteps. Our theoretical framework ensures that both the forward marginals and reverse conditionals remain within the Grouped Dirichlet family, yielding closed-form transitions through multiplicative noise scheduling. This approach not only simplifies training dynamics but also guarantees numerical stability during sampling. Additionally, we replace the traditional evidence lower bound (ELBO) objective with a loss function based on the Kullback-Leibler divergence. Experimental evaluations validate the feasibility of GDD, with quantitative metrics demonstrating superior image generation performance over traditional diffusion models and several contemporary baselines.
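The abstract describes a forward process in which grouped simplex-valued data is perturbed by multiplicative noise while each group stays in the Dirichlet family. As a minimal illustrative sketch only (the paper's actual GDD transition kernel and noise schedule are not given here; the function name, the per-group renormalization, and the single concentration parameter `alpha` are all assumptions for illustration), one could picture a step like:

```python
import numpy as np

rng = np.random.default_rng(0)

def dirichlet_multiplicative_step(x, alpha, groups):
    """Hypothetical forward step: within each feature group, multiply the
    simplex-valued coordinates by Dirichlet noise and renormalize, so each
    group remains on its own probability simplex."""
    y = np.empty_like(x)
    for idx in groups:
        noise = rng.dirichlet(np.full(len(idx), alpha))
        z = x[idx] * noise
        y[idx] = z / z.sum()
    return y

# Toy example: a 6-dimensional vector split into two "channel" groups of 3,
# each normalized to its own simplex before noising.
groups = [np.arange(0, 3), np.arange(3, 6)]
x = np.array([0.1, 0.2, 0.2, 0.15, 0.15, 0.2])
x = np.concatenate([x[g] / x[g].sum() for g in groups])
y = dirichlet_multiplicative_step(x, alpha=5.0, groups=groups)
```

Each group of the output still sums to one, which is the invariant the closed-form grouped transitions are said to preserve; the true GDD schedule would additionally vary the concentration over timesteps.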
Supplementary Material: zip
Primary Area: generative models
Submission Number: 10689