Your Discriminative Model is Secretly a Generative Model

ICLR 2026 Conference Submission10649 Authors

18 Sept 2025 (modified: 27 Nov 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Neural Tangent Kernel, Generative model
TL;DR: We show that discriminative models can serve as generative models.
Abstract: Although discriminative and generative models are fundamentally equivalent in how they capture data distributions, bridging these paradigms -- especially transforming off-the-shelf discriminative models into generative ones -- remains challenging. In this paper, we introduce a universal framework that unlocks the generative potential of any discriminative model by directly leveraging the data manifold encoded in its parameter space. Drawing inspiration from the score function used in diffusion models, which measures the distance between a sample and the data manifold in probability space, we generalize this concept to the functional domain. To achieve this, we introduce the Discriminative Score Function (DSF), which quantifies the functional distance between a sample and the data manifold by mapping both into a shared functional space using the Loss Tangent Kernel (LTK), a variant of the Neural Tangent Kernel. Our framework is architecture- and algorithm-agnostic, as evidenced by various architectures such as ViT, ResNet, and DETR on tasks including object detection, classification, and self-supervised learning (e.g., CLIP and DINO). Additionally, our approach extends to applications in image editing, inpainting, and explainable AI (XAI). Finally, we demonstrate the promising potential of DSF by adopting diffusion model techniques for enhanced generation quality.
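The abstract's core ingredients -- a kernel built from per-sample loss gradients, and a score-like update that moves a sample toward the data manifold -- can be sketched on a toy model. This is a minimal illustration only, assuming a linear model with squared loss; the function names (`ltk`, `dsf_step`) are hypothetical and do not reflect the paper's actual implementation.

```python
# Toy illustration (NOT the paper's method): a linear model
# f(x; w) = w . x with squared loss L = 0.5 * (f - y)^2.
# All names here (ltk, dsf_step) are hypothetical.

def predict(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

def loss_grad_w(w, x, y):
    # dL/dw = (f(x; w) - y) * x : the per-sample loss gradient
    r = predict(w, x) - y
    return [r * xi for xi in x]

def ltk(w, x1, y1, x2, y2):
    # Loss-Tangent-Kernel-style similarity: inner product of the two
    # samples' loss gradients in parameter space.
    g1 = loss_grad_w(w, x1, y1)
    g2 = loss_grad_w(w, x2, y2)
    return sum(a * b for a, b in zip(g1, g2))

def dsf_step(w, x, y, lr=0.1):
    # Score-like update: descend the loss with respect to the *input*,
    # nudging the sample toward lower loss (a stand-in for "toward the
    # data manifold" in this toy setting).
    r = predict(w, x) - y
    grad_x = [r * wi for wi in w]  # dL/dx
    return [xi - lr * gi for xi, gi in zip(x, grad_x)]
```

For example, with `w = [1.0, 0.0]`, `x = [2.0, 0.0]`, `y = 1.0`, the kernel of the sample with itself is the squared gradient norm, and one `dsf_step` reduces the sample's loss.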
Supplementary Material: pdf
Primary Area: interpretability and explainable AI
Submission Number: 10649