Keywords: label smoothing, knowledge distillation, image classification, neural machine translation, binary neural networks
Abstract: This work aims to empirically clarify a recently discovered perspective that label smoothing is incompatible with knowledge distillation. We begin by introducing the motivation behind this incompatibility claim, i.e., that label smoothing erases relative information between teacher logits. We provide a novel connection showing how label smoothing affects the distributions of semantically similar and dissimilar classes. We then propose a metric to quantitatively measure the degree of erased information in a sample's representation. After that, we study the one-sidedness and imperfection of the incompatibility view through extensive analyses, visualizations, and comprehensive experiments on image classification, binary networks, and neural machine translation. Finally, we broadly discuss several circumstances in which label smoothing does indeed lose its effectiveness.
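As background for the incompatibility claim, label smoothing replaces the one-hot training target with a mixture of the one-hot vector and a uniform distribution over classes, which softens the target the teacher is trained toward. A minimal sketch (the smoothing factor `eps=0.1` is an illustrative choice, not a value taken from this work):

```python
import numpy as np

def smooth_labels(one_hot, eps=0.1):
    """Standard label smoothing: y_ls = (1 - eps) * y + eps / K,
    where K is the number of classes."""
    k = one_hot.shape[-1]
    return (1.0 - eps) * one_hot + eps / k

y = np.zeros(5)
y[2] = 1.0
print(smooth_labels(y))  # correct class -> 0.92, each other class -> 0.02
```

Because every incorrect class receives the same target mass, training with smoothed labels pulls the logits of incorrect classes toward each other, which is the basis for the claim that relative (inter-class) information in the teacher's logits is erased.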
One-sentence Summary: This work empirically clarifies a recently discovered perspective that label smoothing is incompatible with knowledge distillation. Project page: http://zhiqiangshen.com/projects/LS_and_KD/index.html.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Data: [CUB-200-2011](https://paperswithcode.com/dataset/cub-200-2011), [ImageNet](https://paperswithcode.com/dataset/imagenet), [ImageNet-LT](https://paperswithcode.com/dataset/imagenet-lt), [Places](https://paperswithcode.com/dataset/places), [iNaturalist](https://paperswithcode.com/dataset/inaturalist)