Ensemble Distillation for Out-of-distribution Detection

Published: 01 Jan 2023 · Last Modified: 13 Nov 2024 · ICPADS 2023 · CC BY-SA 4.0
Abstract: Out-of-distribution detection is critical for the reliable deployment of deep neural networks. To reduce model uncertainty, we propose a simple yet effective approach based on ensemble knowledge distillation. We blend model ensembling with knowledge distillation to improve generalization on in-domain recognition, thereby yielding accurate and robust out-of-distribution detection. The former effectively expands the feature space used for data recognition, while the latter further regularizes the model through knowledge distillation, enhancing in-domain feature recognition. This integration lowers model uncertainty on in-domain features and improves anomaly detection in a more scalable manner. We justify our approach through extensive experiments on various benchmarks, demonstrating significant improvements in out-of-distribution detection. We further validate it on a variety of up-to-date DNN architectures, such as the Vision Transformer.
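The abstract does not spell out the training objective, but a common way to realize ensemble distillation is to train a student against the averaged soft predictions of several teachers, then score out-of-distribution inputs with the student's maximum softmax probability. Below is a minimal PyTorch sketch under those assumptions; the function names (`distillation_loss`, `msp_ood_score`) and the hyperparameters `T` and `alpha` are illustrative, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits_list, labels,
                      T=4.0, alpha=0.5):
    """Assumed KD objective: cross-entropy on hard labels blended with
    KL divergence toward the ensemble's averaged softened predictions."""
    # Average the teachers' temperature-softened probabilities.
    teacher_probs = torch.stack(
        [F.softmax(t / T, dim=-1) for t in teacher_logits_list]
    ).mean(dim=0)
    # KL term pulls the student toward the ensemble; T^2 rescales gradients
    # to keep the two loss terms on comparable magnitudes (Hinton et al.).
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                  teacher_probs, reduction="batchmean") * T * T
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

def msp_ood_score(logits):
    """Maximum softmax probability: lower values flag likely OOD inputs."""
    return F.softmax(logits, dim=-1).max(dim=-1).values
```

At test time, inputs whose `msp_ood_score` falls below a validation-chosen threshold would be rejected as out-of-distribution; the paper's actual scoring rule may differ.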