Rethinking Reparameterization of Stochastic Processes in Generative Modeling
Keywords: diffusion, bridge, stochastic process, image enhancement
Abstract: Recent advances in diffusion-based image enhancement have motivated approaches that seek to reuse large pretrained diffusion models across different stochastic processes. In particular, it has been suggested that a diffusion backbone trained under one stochastic process can be repurposed to sample from alternative, task-specific processes through suitable reparameterizations, while relying on the pretrained model to estimate the underlying clean data. This perspective raises the question of whether stochastic processes can, in general, be transformed into one another without retraining the generative backbone. In this work, we study this question from a fundamental standpoint and show that such stochastic process reparameterization is not possible. Specifically, we prove that under broad and commonly satisfied assumptions, any attempt to reparameterize a pretrained stochastic process backbone to represent a different stochastic process necessarily collapses to the original process itself. While reparameterizations may alter the parameterization of the sampling procedure, they cannot modify the underlying stochastic dynamics learned by the pretrained model. We support our theoretical results with experiments demonstrating that reparameterization-based methods yield sampling behavior equivalent to standard diffusion with a modified sampling method. Our findings imply that a generative backbone trained under a given stochastic process can be used to sample only from that process, and cannot be reused to represent fundamentally different stochastic dynamics through reparameterization alone.
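An illustrative worked equation (assumed notation, not taken from the submission itself) for the flavor of collapse the abstract describes: consider a pretrained diffusion whose marginals are Gaussian in the clean data.

```latex
% Sketch under assumed notation: a pretrained diffusion with marginals
%   x_t = \alpha_t x_0 + \sigma_t \varepsilon, \qquad \varepsilon \sim \mathcal{N}(0, I).
% An affine reparameterization with scaling c_t and time change \tau(t) gives
%   \tilde{x}_t = c_t \, x_{\tau(t)}
%               = \underbrace{c_t \alpha_{\tau(t)}}_{\tilde{\alpha}_t} x_0
%               + \underbrace{c_t \sigma_{\tau(t)}}_{\tilde{\sigma}_t} \varepsilon,
% i.e. marginals of the same form \tilde{\alpha}_t x_0 + \tilde{\sigma}_t \varepsilon.
% The "new" process stays inside the original Gaussian-diffusion family, merely
% re-indexed in time and scale, consistent with the claim that such
% reparameterizations change the sampler, not the learned stochastic dynamics.
```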
Submission Number: 86