Generalizing Cross Entropy Loss with a Beta Proper Composite Loss: An Improved Loss Function for Open Set Recognition

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: Proper Composite Loss, Open Set Recognition in deep learning, Out-of-distribution detection in deep learning
Abstract: Open set recognition involves identifying data instances encountered at test time that do not belong to the known classes in the training set. The majority of recent deep learning approaches to open set recognition use a cross entropy loss to train their networks; surprisingly, other loss functions are seldom explored. In our work, we generalize cross entropy with a Beta loss: a proper composite loss with a Beta weight function. This weight function adds the flexibility of putting more emphasis on different parts of the range of the observation-conditioned class probability (i.e. $P(Y|X)$) during training. We show that the flexibility gained through this Beta loss function produces consistent improvements over cross entropy loss for open set recognition and yields state-of-the-art results relative to recent methods.
One-sentence Summary: We achieve state-of-the-art results on Open Set Recognition by replacing Log Loss with a Beta Proper Composite Loss
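
The submission itself includes no code, so the following is a minimal sketch, not the authors' implementation, of a binary Beta proper loss in the spirit of the abstract: the weight function over the class-probability range is $w(c) = c^{\alpha-1}(1-c)^{\beta-1}$, the parameterization studied by Buja, Stuetzle and Shen (2005), under which $\alpha = \beta \to 0$ recovers log loss and $\alpha = \beta = 1$ gives a squared-error-style loss. The function name `beta_loss` and the use of SciPy's incomplete beta function for the closed form are illustrative assumptions; the closed form requires $\alpha, \beta > 0$.

```python
# A minimal sketch (an assumption, not the authors' code) of a binary Beta
# proper loss: weight function w(c) = c**(alpha-1) * (1-c)**(beta-1) over
# the class-probability range c = P(Y=1|X).
import numpy as np
from scipy.special import betainc, beta as beta_fn  # betainc is regularized

def beta_loss(y, q, alpha=1.0, beta=1.0, eps=1e-7):
    """Beta proper loss for labels y in {0, 1} and predicted probabilities
    q = P(Y=1|X). Closed form via the incomplete beta function; requires
    alpha, beta > 0."""
    q = np.clip(np.asarray(q, dtype=float), eps, 1.0 - eps)
    # Partial loss for y = 1: int_q^1 (1-c) w(c) dc
    #                       = B(alpha, beta+1) * (1 - I_q(alpha, beta+1))
    loss1 = beta_fn(alpha, beta + 1.0) * (1.0 - betainc(alpha, beta + 1.0, q))
    # Partial loss for y = 0: int_0^q c w(c) dc
    #                       = B(alpha+1, beta) * I_q(alpha+1, beta)
    loss0 = beta_fn(alpha + 1.0, beta) * betainc(alpha + 1.0, beta, q)
    return np.where(np.asarray(y) == 1, loss1, loss0)

if __name__ == "__main__":
    y = np.array([1, 0, 1])
    q = np.array([0.9, 0.2, 0.4])
    # With alpha = beta = 1 the partial losses are (1-q)**2/2 and q**2/2.
    print(beta_loss(y, q, alpha=1.0, beta=1.0))  # [0.005 0.02  0.18 ]
```

Varying $\alpha$ and $\beta$ shifts the weight mass toward low or high values of $P(Y|X)$, which is the flexibility the abstract refers to: for example, a small $\alpha$ concentrates $w(c)$ near $c = 0$, up-weighting errors on instances assigned low class probability.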