Abstract: Out-of-distribution (OOD) detection is essential for the safe deployment of machine learning models. Extensive work has focused on devising various scoring functions for detecting OOD samples, while only a few studies focus on training neural networks with model calibration objectives, which often compromise predictive accuracy and support only a limited choice of scoring functions. In this work, we first identify the feature collapse phenomenon in Logit Normalization (LogitNorm), then propose a novel hyperparameter-free formulation that significantly benefits a wide range of post-hoc detection methods. Specifically, we devise a feature distance-awareness loss term in addition to LogitNorm, termed ELogitNorm, which enables improved OOD detection and in-distribution (ID) confidence calibration. Extensive experiments on standard benchmarks demonstrate that our approach outperforms state-of-the-art training-time methods in OOD detection while maintaining strong ID classification accuracy. Our code is available at https://github.com/limchaos/ElogitNorm.