Born-Again Neural Networks

ICML 2018
Abstract: Knowledge Distillation (KD) consists of transferring "knowledge" from one machine learning model (the teacher) to another (the student). Commonly, the teacher is a high-capacity model with formidable…
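
For reference, a minimal sketch of a generic knowledge-distillation objective, not the paper's exact training recipe: the student is fit to a mixture of the ground-truth labels and the teacher's softened predictions. The temperature T and mixing weight alpha below are illustrative hyperparameters, assumed for this example.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Generic KD objective: weighted sum of cross-entropy on hard labels and
    KL divergence between temperature-softened student and teacher outputs."""
    # Hard-label term: standard cross-entropy against the true labels.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label term: KL divergence between softened distributions.
    # The T**2 factor keeps gradient magnitudes comparable across temperatures.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T ** 2)
    return alpha * ce + (1.0 - alpha) * kd
```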