Abstract: In many critical Machine Learning applications, such as autonomous driving
and medical image diagnosis, the detection of out-of-distribution (OOD) samples is as crucial as accurately classifying in-distribution (ID) inputs. Recently, Outlier Exposure (OE)-based methods have shown promising results in detecting OOD inputs via model fine-tuning with auxiliary outlier data. However, most previous OE-based approaches focus on synthesizing extra outlier samples or introducing regularization to diversify the OOD sample space, which is difficult to quantify in practice. In this work, we propose a novel and
straightforward method called Margin bounded Confidence Scores (MaCS) to
address the nontrivial OOD detection problem by enlarging the disparity be-
tween ID and OOD scores, which in turn makes the decision boundary more compact and facilitates effective separation with a simple threshold. Specifically,
we augment the learning objective of an OE-regularized classifier with a supplementary constraint that penalizes high confidence scores for OOD inputs relative to those of ID inputs, significantly enhancing OOD detection performance while maintaining ID classification accuracy. Extensive experiments
on various benchmark datasets for image classification tasks demonstrate the effectiveness of the proposed method, which significantly outperforms state-of-the-art (SOTA) methods on multiple benchmark metrics.
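The margin-bounded constraint described above can be illustrated with a minimal sketch. This is not the paper's exact formulation: the function names (`softmax_confidence`, `margin_penalty`) and the margin value `m` are illustrative assumptions; it simply shows how a hinge-style penalty charges the model whenever an OOD confidence score comes within a margin of an ID confidence score.

```python
import math

def softmax_confidence(logits):
    """Max softmax probability, a common choice of confidence score."""
    shifted = [z - max(logits) for z in logits]  # numerical stability
    exps = [math.exp(z) for z in shifted]
    return max(exps) / sum(exps)

def margin_penalty(id_scores, ood_scores, m=0.5):
    """Hinge penalty, averaged over ID/OOD pairs: a pair contributes
    only when the OOD confidence exceeds the ID confidence minus m,
    pushing OOD scores at least a margin m below ID scores."""
    total, count = 0.0, 0
    for s_id in id_scores:
        for s_ood in ood_scores:
            total += max(0.0, s_ood - s_id + m)
            count += 1
    return total / count
```

In an OE-style training loop, a term like this would be added to the standard classification loss on ID data and the uniform-confidence loss on auxiliary outliers, widening the gap between ID and OOD score distributions so that a single threshold separates them.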