Abstract: Average-K classification is an alternative to top-K classification in which the number of labels returned varies with the ambiguity of the input image but must average to K over all samples. A simple method for this task is to threshold the softmax output of a model trained with the cross-entropy loss. This approach is provably asymptotically consistent, but it is not guaranteed to be optimal for a finite set of samples. In this paper, we propose a new loss function based on a multi-label classification head added alongside the classical softmax head. This second head is trained using pseudo-labels generated by thresholding the softmax head while guaranteeing that K classes are returned on average. We show that this approach allows the model to better capture ambiguities between classes and, as a result, to return more consistent sets of possible classes. Experiments on two datasets from the literature demonstrate that our approach outperforms the softmax baseline, as well as several other loss functions designed more generally for weakly supervised multi-label classification. The gains are larger when uncertainty is higher, especially for classes with few samples.
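To make the thresholding step concrete, here is a minimal NumPy sketch (not the authors' code) of how a global threshold on softmax probabilities can be calibrated so that the returned sets contain K classes on average over a set of samples, and how the same thresholding can produce binary pseudo-labels for a multi-label head; the function names and the batch-level calibration are illustrative assumptions.

```python
import numpy as np

def average_k_threshold(probs, k):
    """Global threshold t such that, over the given samples, an average
    of k classes per sample satisfy p >= t: with n samples, the
    (n * k)-th largest probability overall admits exactly n * k entries
    (modulo ties), i.e. k per sample on average."""
    n_samples = probs.shape[0]
    flat = np.sort(probs.ravel())[::-1]      # all probabilities, descending
    return flat[int(n_samples * k) - 1]

def predict_sets(probs, threshold):
    """Per-sample candidate sets: class indices clearing the threshold.
    Set size varies with input ambiguity but averages to k."""
    return [np.flatnonzero(p >= threshold) for p in probs]

# Toy usage: 4 samples, 5 classes, target average set size K = 2.
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 5))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # softmax
t = average_k_threshold(probs, k=2)
sets = predict_sets(probs, t)
assert sum(len(s) for s in sets) == 2 * len(sets)  # K on average over the batch

# The same thresholding can generate binary pseudo-labels for training a
# multi-label head (e.g., with a binary cross-entropy loss):
pseudo_labels = (probs >= t).astype(np.float32)    # shape (n_samples, n_classes)
```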
External IDs: dblp:conf/wacv/GarcinSJS25