Abstract: We introduce a framework for designing efficient diffusion models on $d$-dimensional symmetric-space Riemannian manifolds, including the torus, sphere, special orthogonal group, and unitary group. Existing manifold diffusion models often depend on heat kernels, which lack closed-form expressions and require either $d$ gradient evaluations or exponential-in-$d$ arithmetic operations per training step. We introduce a new diffusion model for symmetric manifolds with a spatially varying covariance, allowing us to leverage a projection of Euclidean Brownian motion to bypass heat kernel computations. Our training algorithm minimizes a novel efficient objective derived via Itô's Lemma, allowing each step to run in $O(1)$ gradient evaluations and nearly-linear-in-$d$ ($O(d^{1.19})$) *arithmetic* operations, reducing the gap between diffusions on symmetric manifolds and Euclidean space. Manifold symmetries ensure the diffusion satisfies an "average-case" Lipschitz condition, enabling accurate and efficient sample generation. Empirically, our model outperforms prior methods in training speed and improves sample quality on synthetic datasets on the torus, special orthogonal group, and unitary group.
Lay Summary: Diffusion models have recently achieved remarkable success in generating synthetic data, such as realistic images, audio, and video. These models work well when data lives in flat, Euclidean space. However, in many scientific and engineering applications—such as molecular drug discovery, quantum physics, and robotics—data naturally lies on curved, non-Euclidean spaces known as manifolds. Training diffusion models on these spaces is often computationally expensive, requiring either many gradient computations or exponentially large runtimes in the data dimension.
In this paper, we develop a new type of diffusion model that is efficient to train and sample from on a broad class of non-Euclidean spaces called symmetric manifolds, including spheres, tori, and the special orthogonal and unitary groups. Our key idea is to design a diffusion process that incorporates a curvature-aware covariance term. This allows us to simulate the diffusion by projecting simple Euclidean noise onto the manifold, significantly reducing computational cost.
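To make the projection idea concrete, here is a minimal illustrative sketch (not the paper's actual algorithm) of one noising step on the special orthogonal group $SO(n)$: add ambient Euclidean Gaussian noise to the current matrix, then project back to the nearest rotation via the polar decomposition (orthogonal Procrustes). The function name `project_to_SO_n` and the noise scale `sigma` are hypothetical choices for this example.

```python
import numpy as np

def project_to_SO_n(m):
    """Project a square matrix onto SO(n): the closest rotation
    in Frobenius norm, computed from the SVD (polar factor)."""
    u, _, vt = np.linalg.svd(m)
    # Flip the last left-singular vector if needed so det = +1.
    if np.linalg.det(u @ vt) < 0:
        u[:, -1] *= -1.0
    return u @ vt

rng = np.random.default_rng(0)
n = 4
x = np.eye(n)                        # current point on SO(n)
sigma = 0.1                          # illustrative noise scale for one step
z = rng.standard_normal((n, n))      # ambient Euclidean Gaussian noise
x_next = project_to_SO_n(x + sigma * z)   # back on SO(n)
```

The appeal of this style of step is that the expensive object, the manifold heat kernel, is never evaluated: only a Gaussian draw and a linear-algebra projection are needed.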
As a result, each step of our training algorithm requires only a constant number of gradient evaluations and a number of arithmetic operations nearly-linear in the data dimension, narrowing the performance gap between manifold-based and Euclidean diffusion models. We also prove that our model satisfies a probabilistic smoothness condition that guarantees accurate and stable sample generation.
Experiments on synthetic datasets show that our method trains faster and produces higher-quality samples compared to previous approaches, across a variety of manifolds commonly used in scientific applications.
Link To Code: https://github.com/mangoubi/Efficient-Diffusion-Models-for-Symmetric-Manifolds
Primary Area: Theory->Probabilistic Methods
Keywords: Diffusion Models, Symmetric Manifolds, Random Matrix Theory, Score-Based Generative Models, Efficient Sampling
Submission Number: 3938