Rotations on Latent Hyperspheres: a Geometry-Aware Guiding Framework for Diffusion Models

19 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Diffusion Models, Reinforcement Learning, Latent Space Optimization, Physics-Informed Machine Learning, High-Dimensional Geometry, Constraint-Aware Generation
Abstract: Diffusion models have emerged as a powerful tool across diverse domains. However, their purely data-driven nature can produce samples that deviate from domain-governing constraints. We introduce a plug-and-play, Reinforcement Learning framework that operates in the latent space of pre-trained diffusion models to optimize initial noise samples. Our approach, motivated by the near-spherical geometry of high-dimensional Gaussian distributions, employs a novel rotation-matrix-based scheme for efficient latent space exploration. This steers the model toward more feature-preserving outputs, guided by task-specific rewards computed on the final samples. We evaluate our method on three diffusion models: one trained on solutions of the Darcy Flow PDE, one on a synthetic dataset with complex structural features, and a text-conditioned one. Across all three settings, our framework yields significant improvements in sample quality, achieving a ${\sim}25\%$ relative reduction in PDE residual, up to a ${\sim}44\%$ relative improvement on the synthetic dataset's feature-alignment metric, and up to a ${\sim}80\%$ relative improvement on human preference, compared to the vanilla diffusion models. Finally, we show that rotation-matrix-based exploration significantly outperforms unconstrained exploration, validating our geometry-aware approach and establishing a more effective method for latent space control.
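The geometric motivation can be sketched as follows (an illustrative example, not the paper's implementation; the dimension `d` and the QR-based rotation sampler are assumptions for the demo). A high-dimensional standard Gaussian sample concentrates near a sphere of radius $\sqrt{d}$, so applying an orthogonal rotation moves the latent along that shell while leaving it a valid draw from the same $\mathcal{N}(0, I)$ prior:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 512                       # latent dimension (illustrative choice)
z = rng.standard_normal(d)    # initial noise sample for a diffusion model

# Sample a random rotation: QR-decompose a Gaussian matrix to obtain an
# orthogonal Q, sign-corrected so the result is Haar-uniform over O(d).
A = rng.standard_normal((d, d))
Q, R = np.linalg.qr(A)
Q *= np.sign(np.diag(R))      # flip column signs by the sign of diag(R)

z_rot = Q @ z                 # explored latent: same norm, same prior

# Both norms agree (up to float error) and sit near sqrt(d) ~ 22.6,
# illustrating the near-spherical concentration the abstract invokes.
print(np.linalg.norm(z), np.linalg.norm(z_rot), np.sqrt(d))
```

Unconstrained additive perturbations, by contrast, can push the latent off this high-probability shell, which is one intuition for why the rotation-constrained exploration compared in the paper can outperform unconstrained exploration.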
Primary Area: generative models
Submission Number: 21692