Bidirectional Beta-Tuned Diffusion Model

Tianyi Zheng, Jiayang Zou, Peng-Tao Jiang, Hao Zhang, Jinwei Chen, Jia Wang, Bo Li

Published: 01 Jan 2026, Last Modified: 04 Dec 2025 · IEEE Transactions on Pattern Analysis and Machine Intelligence · CC BY-SA 4.0
Abstract: Diffusion models have gained significant attention in the field of generative modeling due to their capability to produce high-quality samples. However, recent studies show that applying a uniform treatment to all distributions during the training of diffusion models is sub-optimal. In this paper, we present a comprehensive theoretical analysis of the forward process in diffusion models. Our findings indicate that distribution variations are not uniform throughout the diffusion process, with the sharpest changes occurring during the initial stages. Moreover, we observe that the initial distribution converges to a Gaussian distribution at an exponential rate, indicating that different initial distributions rapidly become quite similar during the forward diffusion process. Consequently, a uniform timestep sampling strategy does not effectively capture these dynamics, potentially leading to sub-optimal training outcomes for diffusion models. To remedy this, we introduce the Bidirectional Beta-Tuned Diffusion Model (BB-TDM). The BB-TDM leverages the Beta distribution to design the timestep sampling distribution and to enhance the separation between different initial distributions during the diffusion process. By selecting appropriate parameters, the BB-TDM aligns the timestep sampling distribution with the properties of the forward diffusion process and moderates the convergence speed of different initial distributions. Extensive experiments across various benchmark datasets and diffusion models confirm the efficacy of the proposed BB-TDM.
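The core mechanism the abstract describes, replacing uniform timestep sampling with a Beta-distributed one that concentrates on the early, fast-changing stages of the forward process, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the parameter values `alpha=0.5, beta=1.5` and the helper name `sample_timesteps` are assumptions chosen to skew sampling toward small timesteps.

```python
import numpy as np

def sample_timesteps(batch_size, num_steps=1000, alpha=0.5, beta=1.5, rng=None):
    """Draw diffusion timesteps from a Beta(alpha, beta) distribution
    instead of the usual uniform distribution over {0, ..., num_steps-1}.

    With alpha < 1 and beta > 1 the density concentrates near t = 0,
    matching the observation that distribution changes are sharpest in
    the early forward-diffusion stages. The (alpha, beta) values here
    are illustrative, not taken from the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.beta(alpha, beta, size=batch_size)        # samples in (0, 1)
    t = np.floor(u * num_steps).astype(np.int64)      # map onto integer timesteps
    return np.clip(t, 0, num_steps - 1)
```

In a training loop these timesteps would replace the usual `randint(0, num_steps)` draw, so the denoiser sees proportionally more examples from the regime where the forward process changes fastest.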