MaskedKD: Efficient Distillation of Vision Transformers with Masked Images

18 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: vision transformer, training cost, token pruning, knowledge distillation, supervised distillation
TL;DR: We propose a simple yet effective strategy that saves a large amount of computation when distilling vision transformers.
Abstract: Knowledge distillation is an effective method for training lightweight models, but the cost of acquiring teacher supervision on training samples is often significant. This supervision cost can be overwhelmingly large when we distill from large-scale proprietary models, such as vision transformers (ViTs). We present MaskedKD, a simple yet effective strategy that significantly reduces the teacher supervision cost without sacrificing student accuracy or requiring direct access to the (potentially proprietary) teacher. Specifically, MaskedKD reduces the cost of running the teacher at inference by masking a fraction of the image patch tokens fed to the teacher, thereby skipping the computation required to process those patches. The mask locations are selected so as not to mask away the core features of the image that the student uses for prediction. This masking mechanism operates on an attention score of the student, which is already computed during the student's forward pass, and thus incurs almost no additional computation. Our experiments show that MaskedKD dramatically reduces the teacher supervision cost, saving up to 50% of teacher FLOPs without any drop in student accuracy.
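To make the masking idea in the abstract concrete, below is a minimal PyTorch sketch of one distillation step. It is an assumed interface, not the authors' released code: the `return_cls_attention` flag, the teacher's `visible_patch_indices` argument, and the loss weighting are all illustrative placeholders for a ViT student that exposes its [CLS]-to-patch attention and a teacher that can run on a subset of patch tokens.

```python
import torch
import torch.nn.functional as F

def maskedkd_distill_step(student, teacher, images, labels,
                          mask_ratio=0.5, tau=1.0):
    # Student forward pass. Assumes the student returns its logits and the
    # last-layer [CLS]-to-patch attention averaged over heads,
    # shape (batch, num_patches). This attention is computed anyway,
    # so patch selection adds almost no cost.
    student_logits, cls_attn = student(images, return_cls_attention=True)

    # Keep the patches the student attends to most; mask the rest.
    num_keep = int(cls_attn.shape[1] * (1.0 - mask_ratio))
    keep_idx = cls_attn.topk(num_keep, dim=1).indices  # (batch, num_keep)

    # The teacher processes only the kept patch tokens, so its FLOPs
    # shrink roughly in proportion to mask_ratio. Assumes the teacher
    # accepts an index set of visible patches (hypothetical argument,
    # in the style of masked-autoencoder ViTs).
    with torch.no_grad():
        teacher_logits = teacher(images, visible_patch_indices=keep_idx)

    # Standard soft-target KD loss plus supervised cross-entropy.
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / tau, dim=-1),
        F.softmax(teacher_logits / tau, dim=-1),
        reduction="batchmean",
    ) * tau**2
    ce_loss = F.cross_entropy(student_logits, labels)
    return ce_loss + kd_loss
```

The key design choice this sketch reflects is that the mask is chosen by the student, not the teacher: patches the student relies on for its prediction are kept, so the teacher's supervision stays aligned with the features the student actually uses.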
Supplementary Material: zip
Primary Area: optimization
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1332