Zero-Shot Knowledge Distillation in Deep Networks

ICML 2019 (modified: 11 Nov 2022)
Abstract: Knowledge distillation deals with the problem of training a smaller model (Student) from a high-capacity source model (Teacher) so as to retain most of its performance. Existing a...
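The abstract's one-line description of knowledge distillation can be made concrete with the standard temperature-scaled distillation loss (in the style of Hinton et al.). The sketch below is an illustrative NumPy implementation, not the paper's method; the function names, the temperature value, and the use of KL divergence on softened logits are assumptions about the conventional setup, not details from this listing.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over the last axis, with the usual
    # max-subtraction for numerical stability.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # soft student predictions
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return float((temperature ** 2) * kl.mean())
```

In a full training loop this term is typically mixed with the ordinary cross-entropy on hard labels; the "zero-shot" setting the title refers to is precisely the case where the teacher's original training data is unavailable for computing such targets.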