Spurious Correlations in Diffusion Models and How to Fix Them

Published: 06 Mar 2025 (Last Modified: 06 Mar 2025) · SCSL @ ICLR 2025 · CC BY 4.0
Track: regular paper (up to 6 pages)
Keywords: Diffusion models, Conditional Independence
TL;DR: Generative models also suffer from spurious correlations due to dependency between attributes. We propose a method to break these dependencies.
Abstract: Generative models are not immune to spurious correlations. In generative models, spuriousness manifests as a failure to compose attributes faithfully, often referred to as compositionality. To compose attributes successfully, a model must accurately capture the statistical independence between them. This paper shows that standard conditional diffusion models violate this assumption even when all attribute compositions are observed during training, and that the violation is significantly more severe when only a subset of the compositions is observed. We propose CoInD to address this problem. CoInD explicitly enforces statistical independence between the conditional marginal distributions by minimizing the Fisher divergence between the joint and marginal distributions. The theoretical advantages of CoInD are reflected in both qualitative and quantitative experiments, demonstrating significantly more faithful and precise controlled generation of samples for arbitrary compositions of attributes.
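To make the independence constraint concrete: under conditional independence of two attributes c1 and c2 given x, the joint score decomposes as s(x | c1, c2) = s(x | c1) + s(x | c2) - s(x). A Fisher-divergence-style penalty then measures the squared gap between the model's joint score and this composed score. The sketch below is an illustrative reconstruction of such a regularizer, not the authors' implementation; the function name `coind_penalty` and the toy inputs are assumptions for illustration.

```python
import numpy as np

def coind_penalty(s_joint, s_c1, s_c2, s_uncond):
    """Illustrative CoInD-style regularizer (a sketch, not the paper's code).

    Under conditional independence, the joint score should satisfy
        s(x | c1, c2) = s(x | c1) + s(x | c2) - s(x).
    The penalty is the mean squared deviation of the model's joint score
    from this composed score, a Fisher-divergence-style gap.
    """
    composed = s_c1 + s_c2 - s_uncond
    return float(np.mean((s_joint - composed) ** 2))

# Toy check with random stand-in "scores" of shape (batch, dim).
rng = np.random.default_rng(0)
s = rng.normal(size=(4, 8))

# If the joint score decomposes exactly, the penalty is zero.
print(coind_penalty(s, s, s, s))        # 0.0

# Any deviation from the decomposition yields a positive penalty.
print(coind_penalty(s + 0.1, s, s, s))  # > 0
```

In training, this penalty would be added to the standard denoising score-matching loss, so that the learned conditional scores compose consistently for unseen attribute combinations.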
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Format: Yes, the presenting author will definitely attend in person because they are attending ICLR for other, complementary reasons.
Funding: Yes, the presenting author of this submission falls under ICLR’s funding aims, and funding would significantly impact their ability to attend the workshop in person.
Presenter: ~Sachit_Gaudi1
Submission Number: 27