Robust Hybrid Quantum-Classical Latent Diffusion Models via Quantum Noise

ICLR 2026 Conference Submission 20775 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: latent diffusion process, quantum machine learning, quantum noise, implicit regularization
TL;DR: We propose a quantum latent diffusion framework with a quantum forward process and a Gaussian reverse process, in which quantum noise implicitly regularizes training for improved robustness and generalization.
Abstract: While quantum generative models offer computational advantages, quantum noise, unavoidable in real quantum hardware, is typically viewed as an obstacle to performance. We challenge this perspective by demonstrating that controlled quantum noise can act as an implicit regularizer for latent diffusion models. We introduce a hybrid quantum-classical latent diffusion architecture in which quantum error channels are injected between parameterized quantum circuit (PQC) layers in the latent space during the forward pass. We retain a classical backward pass to keep training efficient on current quantum hardware. Theoretically, we prove that these quantum channels control the model’s Lipschitz constant with depth and shrink the quantum Fisher information matrix (QFIM), yielding flatter minima in the loss landscape and hence tighter PAC-Bayesian generalization bounds. We conduct extensive experiments on MNIST and CIFAR-10, comparing our model with existing baselines. Our noise-regularized models improve the Fréchet inception distance (FID) by 18.7% over noiseless baselines while maintaining superior robustness under adversarial attacks and parameter perturbations. To our knowledge, this work is among the first to systematically study turning unavoidable quantum noise into leverage for robust generative models.
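To make the forward-pass mechanism described above concrete, here is a minimal density-matrix sketch of a noise channel interleaved with PQC layers. This is an illustrative toy (single qubit, real-valued rotations, hypothetical layer angles and noise strength, numpy only), not the authors' implementation:

```python
import numpy as np

def depolarizing_channel(rho, p):
    """Single-qubit depolarizing channel: mix rho with the maximally mixed state."""
    d = rho.shape[0]
    return (1 - p) * rho + p * np.eye(d) / d

def rotation_layer(rho, theta):
    """Toy stand-in for one PQC layer: a rotation about the Y axis."""
    U = np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                  [np.sin(theta / 2),  np.cos(theta / 2)]])
    return U @ rho @ U.conj().T

# Forward pass: alternate PQC layers with quantum error channels,
# as in the hybrid architecture the abstract describes.
rho = np.array([[1.0, 0.0], [0.0, 0.0]])   # start in |0><0|
for theta in [0.3, 0.7, 1.1]:              # hypothetical layer parameters
    rho = rotation_layer(rho, theta)
    rho = depolarizing_channel(rho, p=0.05)

print(np.trace(rho).real)                  # trace is preserved (close to 1.0)
print(np.trace(rho @ rho).real)            # purity drops below 1: noise contracts the state
```

The contraction of purity toward the maximally mixed state is the toy analogue of the Lipschitz-constant control the paper proves: each channel shrinks distances between states, which is what makes the noise act as a regularizer rather than pure degradation.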
Primary Area: generative models
Submission Number: 20775