Averager Student: Distillation from Undistillable Teacher

01 Mar 2023 (modified: 29 May 2023) · Submitted to Tiny Papers @ ICLR 2023
Keywords: knowledge distillation, model IP protection, model stealing
TL;DR: We propose a novel method that improves distillation from an undistillable teacher.
Abstract: Today, some companies release their black-box models as a service, where users can see only the model's output for a given input. However, such models can be stolen via knowledge distillation by malicious users. Recently, the undistillable teacher (Ma et al., 2021) was introduced to prevent this knowledge leakage: the teacher is trained so that distillation becomes difficult for student models. In this study, with the aim of contributing to solutions for model intellectual property (IP) protection, we propose a novel method that improves distillation from such an undistillable teacher. The code is released at https://github.com/rkevser/AveragerStudent.
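For context, the stealing attack the abstract refers to is ordinary response-based knowledge distillation. Below is a minimal PyTorch sketch of the standard soft-label distillation loss (Hinton et al., 2015) that such an attack would minimize; the function name and the temperature/weighting hyperparameters are illustrative assumptions, not taken from this paper, and this is not the Averager Student method itself.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Standard soft-label knowledge distillation loss (Hinton et al., 2015).

    Mixes the KL divergence between temperature-softened teacher and student
    distributions with the usual cross-entropy on hard labels. T and alpha
    are illustrative defaults, not values from this paper.
    """
    # Soften both output distributions with temperature T.
    soft_student = F.log_softmax(student_logits / T, dim=-1)
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    # T^2 keeps the gradient magnitude of the soft term roughly independent of T.
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

An undistillable (nasty) teacher in the sense of Ma et al. (2021) is trained so that its soft outputs mislead a student optimizing exactly this kind of objective, which is the obstacle the proposed method aims to overcome.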