Just Leaf It: Accelerating Diffusion Classifiers with Hierarchical Class Pruning

05 Sept 2025 (modified: 17 Jan 2026) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Diffusion Models, Hierarchical Classification, Efficient Inference, Pruning
TL;DR: The Hierarchical Diffusion Classifier (HDC) accelerates diffusion-based classification by up to 60% by using a label hierarchy to quickly prune irrelevant categories, reducing computational cost while maintaining or even improving accuracy.
Abstract: Diffusion models, best known for high-fidelity image generation, have recently been repurposed as zero-shot classifiers by applying Bayes’ theorem. This approach avoids retraining but requires evaluating every possible label for each input, making inference prohibitively expensive on large label sets. We address this bottleneck with the Hierarchical Diffusion Classifier (HDC), a training-free method that exploits semantic label hierarchies to prune irrelevant branches early and refine predictions only within promising subtrees. This coarse-to-fine strategy reduces the number of expensive denoiser evaluations, yielding substantial efficiency gains. On ImageNet-1K, HDC achieves up to 60% faster inference while preserving, and in some cases even improving, accuracy (65.16% vs. 64.90%). Beyond ImageNet, we demonstrate that HDC generalizes to datasets without predefined ontologies by constructing hierarchies with large language models. Our results show that hierarchy-aware pruning provides a tunable trade-off between speed and precision, making diffusion classifiers more practical for large-scale and open-set applications.
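The coarse-to-fine strategy in the abstract can be sketched as a greedy search over a label tree: score only the children of surviving nodes, keep the best subtrees, and prune the rest, so most leaf labels are never evaluated by the expensive denoiser. The sketch below is illustrative, not the paper's implementation; the `score` callable, the toy hierarchy, and the `keep` beam width are all placeholder assumptions standing in for the diffusion-model denoising-error evaluation.

```python
def hierarchical_classify(tree, score, keep=1):
    """Coarse-to-fine label search: at each level, score the children of the
    surviving nodes and retain only the `keep` best, pruning other subtrees.
    `tree` maps a node name to its list of children; leaves map to [].
    `score` is a stand-in for the expensive denoiser evaluation (lower = better)."""
    frontier = ["root"]
    while True:
        children = [c for node in frontier for c in tree[node]]
        if not children:  # all surviving nodes are leaves: pick the best one
            return min(frontier, key=score)
        frontier = sorted(children, key=score)[:keep]  # prune early

# Toy hierarchy and scores (hypothetical denoising errors; lower = better fit).
tree = {
    "root": ["animal", "vehicle"],
    "animal": ["cat", "dog"],
    "vehicle": ["car", "bike"],
    "cat": [], "dog": [], "car": [], "bike": [],
}
errors = {"animal": 0.2, "vehicle": 0.9,
          "cat": 0.1, "dog": 0.5, "car": 0.8, "bike": 0.7}

print(hierarchical_classify(tree, errors.get))  # → cat
```

With `keep=1` the "vehicle" subtree is pruned after a single score, so "car" and "bike" are never evaluated; on deep hierarchies such as ImageNet's, this is where the claimed reduction in denoiser calls would come from.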
Primary Area: generative models
Supplementary Material: pdf
Submission Number: 2364