Keywords: Quantum Machine Learning, Knowledge Distillation, Variational Quantum Circuits, Quantum Soft Labels, Hybrid Models, QFT-inspired Encoding.
TL;DR: We use variational quantum circuits as teachers to generate soft labels for knowledge distillation, showing that classical students trained on these quantum signals achieve strong accuracy and robustness—even when the quantum teacher itself is weak.
Abstract: Quantum machine learning offers a path to leverage near-term quantum devices for tasks that remain challenging for classical models. We introduce a quantum–classical hybrid knowledge distillation framework in which variational quantum circuits, equipped with angle and Quantum Fourier Transform-inspired encodings, serve as teachers that generate expressive soft-label distributions. These signals are distilled into lightweight classical students via a hybrid loss that blends hard and soft supervision. On MNIST and CIFAR-10, students distilled from quantum teachers achieve stronger robustness to Gaussian noise and rotations than classical baselines, while retaining high clean accuracy and calibration. Crucially, this shows that even capacity-limited NISQ models can provide valuable supervisory signals, suggesting a practical route toward quantum-enhanced learning without requiring quantum inference at deployment.
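To make the hybrid supervision concrete, below is a minimal sketch of how the hard/soft loss blend described in the abstract is commonly implemented in standard knowledge distillation. This is an illustrative reconstruction, not the authors' code: the function name `hybrid_distillation_loss` and the parameters `alpha` and `temperature` are assumptions, and the quantum teacher is assumed to already output a probability distribution over classes.

```python
# Hypothetical sketch of a hybrid distillation loss blending hard labels with
# soft labels produced by a quantum teacher. Assumes standard KD conventions.
import torch
import torch.nn.functional as F

def hybrid_distillation_loss(student_logits, teacher_probs, hard_labels,
                             alpha=0.5, temperature=2.0):
    """Blend hard-label cross-entropy with KL divergence to the teacher's soft labels.

    student_logits: (batch, classes) raw outputs of the classical student
    teacher_probs:  (batch, classes) soft-label distribution from the quantum teacher
    hard_labels:    (batch,) integer ground-truth class indices
    """
    # Hard supervision: standard cross-entropy against ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, hard_labels)

    # Soft supervision: KL divergence between the temperature-softened student
    # predictions and the teacher's soft-label distribution. The quantum teacher
    # is assumed to yield a valid probability distribution directly.
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_loss = F.kl_div(log_student, teacher_probs, reduction="batchmean") * temperature ** 2

    return alpha * hard_loss + (1.0 - alpha) * soft_loss


# Usage example with random tensors standing in for real model outputs.
student_logits = torch.randn(8, 10)
teacher_probs = F.softmax(torch.randn(8, 10), dim=-1)
hard_labels = torch.randint(0, 10, (8,))
loss = hybrid_distillation_loss(student_logits, teacher_probs, hard_labels)
```

Here `alpha` trades off hard and soft supervision, and scaling the KL term by `temperature ** 2` keeps gradient magnitudes comparable across temperatures, as in standard distillation practice.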
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 12423