Provable Sample-Efficient Transfer Learning Conditional Diffusion Models via Representation Learning

Published: 18 Sept 2025, Last Modified: 29 Oct 2025
Venue: NeurIPS 2025 poster
License: CC BY 4.0
Keywords: conditional diffusion model, transfer learning, sample complexity
TL;DR: We provide the first theoretical sample-efficiency guarantees for conditional diffusion models trained via transfer learning.
Abstract: While conditional diffusion models have achieved remarkable success in various applications, training them from scratch requires abundant data, which is often infeasible in practice. To address this issue, transfer learning has emerged as an essential paradigm in small-data regimes. Despite its empirical success, the theoretical underpinnings of transfer learning for conditional diffusion models remain unexplored. In this paper, we take the first step towards understanding the sample efficiency of transfer learning for conditional diffusion models through the lens of representation learning. Inspired by practical training procedures, we assume that there exists a low-dimensional representation of conditions shared across all tasks. Our analysis shows that, with a representation well learned from the source tasks, the sample complexity of the target task can be reduced substantially. Numerical experiments corroborate our theoretical findings.
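As a rough illustration of the shared-representation assumption described in the abstract, the following minimal PyTorch sketch (with hypothetical dimensions, architectures, and names; not the paper's actual setup) shows the transfer pattern: a low-dimensional condition encoder, presumed to have been learned on source tasks, is frozen, and only a task-specific score head is fit on the small target dataset.

```python
# Minimal sketch (assumed setup, not the authors' code): the condition y is mapped
# to a low-dimensional representation h = phi(y) shared across tasks. phi is
# pretrained on source tasks and frozen; only the task-specific score head is
# trained on the (small) target dataset.
import torch
import torch.nn as nn

COND_DIM, REP_DIM, DATA_DIM = 32, 4, 16  # hypothetical dimensions


class SharedEncoder(nn.Module):
    """Low-dimensional representation of the condition, shared across tasks."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(COND_DIM, 64), nn.SiLU(), nn.Linear(64, REP_DIM))

    def forward(self, y):
        return self.net(y)


class ScoreHead(nn.Module):
    """Task-specific score model s(x_t, t, h) conditioned on the representation h."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(DATA_DIM + REP_DIM + 1, 128), nn.SiLU(), nn.Linear(128, DATA_DIM)
        )

    def forward(self, x_t, t, h):
        return self.net(torch.cat([x_t, h, t], dim=-1))


def denoising_loss(head, encoder, x0, y):
    """Standard denoising score-matching loss using the shared condition encoder."""
    t = torch.rand(x0.size(0), 1)                      # diffusion time in (0, 1)
    noise = torch.randn_like(x0)
    alpha = torch.exp(-0.5 * t)                        # VP-style signal scale
    x_t = alpha * x0 + torch.sqrt(1 - alpha**2) * noise
    pred = head(x_t, t, encoder(y))
    return ((pred - noise) ** 2).mean()


# Transfer step: freeze the encoder learned on source tasks, train only the head.
encoder = SharedEncoder()
encoder.requires_grad_(False)                          # representation reused, not refit
head = ScoreHead()
opt = torch.optim.Adam(head.parameters(), lr=1e-3)

x0_target = torch.randn(8, DATA_DIM)                   # toy target-task batch
y_target = torch.randn(8, COND_DIM)
loss = denoising_loss(head, encoder, x0_target, y_target)
loss.backward()
opt.step()
```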
Primary Area: Theory (e.g., control theory, learning theory, algorithmic game theory)
Submission Number: 17351