In or Out? Fixing ImageNet Out-of-Distribution Detection Evaluation

Published: 16 Apr 2023, Last Modified: 29 Apr 2024 · RTML Workshop 2023
Keywords: OOD detection, Out-of-distribution detection, ImageNet
TL;DR: We show that the currently used datasets for out-of-distribution detection evaluation on ImageNet-1K are severely flawed and provide NINCO, a clean and challenging new OOD test dataset.
Abstract: Out-of-distribution (OOD) detection is the problem of identifying inputs which are unrelated to the in-distribution task. When the in-distribution (ID) is ImageNet-1K, OOD detection performance is commonly evaluated on a small range of test OOD datasets. We find that most of the currently used test OOD datasets have severe issues: in some cases, more than 50$\%$ of the dataset contains objects belonging to one of the ID classes. These erroneous samples heavily distort the evaluation of OOD detectors. As a solution, we introduce NINCO, a novel test OOD dataset in which each sample has been checked to be free of ID objects, and whose fine-grained range of OOD classes allows for a detailed analysis of an OOD detector's strengths and failure modes, particularly when paired with a number of synthetic “OOD unit-tests”. We provide detailed evaluations across a large set of architectures and OOD detection methods on NINCO and the unit-tests, revealing new insights about model weaknesses and the effects of pretraining on OOD detection performance.
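To make the evaluation setting concrete, here is a minimal sketch (not the paper's own code) of how an OOD detector is typically scored: ID and OOD samples each receive a scalar "ID-ness" score, here the maximum softmax probability, and the separation between the two score distributions is summarized with AUROC and the false-positive rate at 95% true-positive rate. The synthetic logits, sample counts, and scoring function are assumptions for illustration only.

```python
# Illustrative sketch only: scoring an OOD detector via max-softmax-probability (MSP).
# Random logits stand in for a real model's outputs on ID / OOD images.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
n_samples, n_classes = 1000, 1000  # hypothetical sizes; ImageNet-1K has 1000 classes

# ID samples: boost one logit per sample to mimic confident in-distribution predictions.
id_logits = rng.normal(size=(n_samples, n_classes))
id_logits[np.arange(n_samples), rng.integers(0, n_classes, n_samples)] += 5.0
# OOD samples: no confidently predicted class.
ood_logits = rng.normal(size=(n_samples, n_classes))

def msp_score(logits):
    """Maximum softmax probability: higher means 'looks more in-distribution'."""
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    return probs.max(axis=1)

scores = np.concatenate([msp_score(id_logits), msp_score(ood_logits)])
labels = np.concatenate([np.ones(n_samples), np.zeros(n_samples)])  # 1 = ID, 0 = OOD

auroc = roc_auc_score(labels, scores)
fpr, tpr, _ = roc_curve(labels, scores)
fpr_at_95tpr = fpr[np.searchsorted(tpr, 0.95)]  # FPR at the threshold accepting 95% of ID
print(f"AUROC: {auroc:.3f}  FPR@95%TPR: {fpr_at_95tpr:.3f}")
```

Such numbers are only meaningful if the nominal OOD samples are actually free of ID objects, which is exactly the property NINCO enforces by checking every sample.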
Community Implementations: 5 code implementations (https://www.catalyzex.com/paper/arxiv:2306.00826/code)