K-Gen: Unlocking High-Resolution Data-Free Knowledge Distillation via Key Region Generation

ICLR 2026 Conference Submission 20289 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Data-free Knowledge Distillation, Large-scale Datasets, Knowledge Distillation
TL;DR: We propose K-Gen, a data-free knowledge distillation method that generates low-res key regions using class-activation scores and multi-resolution diversity, achieving strong results on both low and high-resolution datasets.
Abstract: Data-Free Knowledge Distillation (DFKD) is an advanced technique that enables knowledge transfer from a teacher model to a student model without relying on the original training data. While DFKD methods have achieved success on smaller datasets like CIFAR10 and CIFAR100, they encounter challenges on larger, high-resolution datasets such as ImageNet. A primary issue with previous approaches is that they generate synthetic images at high resolutions (e.g., $224 \times 224$) without leveraging information from real images, often producing noisy images that lack essential class-specific features on large datasets. Additionally, the computational cost of generating the extensive data needed for effective knowledge transfer can be prohibitive. In this paper, we introduce \underline{K}ey Region Data-free \underline{Gen}eration (K-Gen) to address these limitations. K-Gen generates only the key regions of images at lower resolutions, using class-activation scores to ensure that the generated images retain critical, class-specific features. To further enhance diversity, we propose multi-resolution generation and embedding diversity techniques that strengthen latent space representations, leading to significant performance improvements. Experimental results demonstrate that K-Gen achieves state-of-the-art performance across small-, high-, and mega-resolution datasets, with double-digit performance gains in nearly all ImageNet and subset experiments. Code is available at \url{https://anonymous.4open.science/r/K-Gen-DFKD}.
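To make the key-region idea concrete, the sketch below illustrates one plausible reading of it: score spatial locations with a class-activation map (CAM) and crop a low-resolution key region around the peak. This is an illustrative sketch only, not the authors' implementation; the function names, the CAM formulation (classifier-weighted feature maps), and the argmax-centered cropping strategy are all assumptions not taken from the paper.

```python
import numpy as np

def class_activation_map(features, fc_weights, class_idx):
    """CAM-style score map: final conv feature maps weighted by the
    classifier weights of the target class (an assumed formulation).

    features:   (C, H, W) teacher feature maps
    fc_weights: (num_classes, C) final linear-layer weights
    """
    w = fc_weights[class_idx]                 # (C,)
    cam = np.tensordot(w, features, axes=1)   # (H, W)
    cam -= cam.min()                          # normalize to [0, 1]
    if cam.max() > 0:
        cam /= cam.max()
    return cam

def crop_key_region(image, cam, size):
    """Crop a size x size key region centered on the CAM peak,
    clamped to the image bounds (hypothetical selection rule)."""
    ch, cw = cam.shape
    ih, iw = image.shape[:2]
    y, x = np.unravel_index(np.argmax(cam), cam.shape)
    # Map CAM-grid coordinates to image coordinates.
    y, x = int(y * ih / ch), int(x * iw / cw)
    top = min(max(y - size // 2, 0), ih - size)
    left = min(max(x - size // 2, 0), iw - size)
    return image[top:top + size, left:left + size]
```

A generator could then synthesize only such low-resolution crops (e.g., $64 \times 64$ instead of $224 \times 224$), cutting generation cost while keeping the class-discriminative content the teacher responds to.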
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 20289