Abstract: Network calibration aims to accurately estimate confidence levels of predictions, which is particularly important for deploying deep neural networks in real-world systems. Recent approaches leverage mixup to calibrate the network's predictions during training. However, they do not consider the
problem that mixtures of labels in mixup may not accurately
represent the actual distribution of augmented samples. In
this paper, we present RankMixup, a novel mixup-based
framework alleviating the problem of the mixture of labels
for network calibration. To this end, we propose to use
an ordinal ranking relationship between raw and mixup-augmented samples as an alternative supervisory signal to
the label mixtures for network calibration. We hypothesize
that the network should estimate a higher level of confidence for the raw samples than for the augmented ones (Fig. 1). To implement this idea, we introduce a mixup-based ranking loss (MRL) that encourages augmented samples to have lower confidences than raw ones, thereby maintaining the
ranking relationship. We also propose to leverage the ranking relationship among multiple mixup-augmented samples
to further improve the calibration capability. Augmented
samples with larger mixing coefficients are expected to have
higher confidences, and vice versa (Fig. 1). That is, the order of confidences should be aligned with that of the mixing coefficients. To this end, we introduce a novel loss, M-NDCG, which reduces the number of misaligned pairs of coefficients and confidences. Extensive experimental results on
standard benchmarks for network calibration demonstrate
the effectiveness of RankMixup.
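The two loss ideas described above can be sketched in plain Python. Note that the function names, the hinge form of the ranking loss, and the exact NDCG formulation below are illustrative assumptions, not the paper's precise definitions:

```python
import math

def mixup_ranking_loss(conf_raw, conf_aug, margin=0.1):
    # Hinge-style sketch of MRL: penalize the case where the augmented
    # sample's confidence is not lower than the raw sample's by `margin`.
    # (The margin value and hinge form are assumptions for illustration.)
    return max(0.0, margin - (conf_raw - conf_aug))

def m_ndcg(mix_coeffs, confidences):
    # NDCG-style alignment score (a sketch): treat mixing coefficients as
    # relevance scores and rank the augmented samples by their predicted
    # confidences. A score of 1.0 means the confidence ordering exactly
    # matches the coefficient ordering; misaligned pairs lower the score,
    # so 1 - m_ndcg(...) could serve as a loss term.
    order = sorted(range(len(confidences)), key=lambda i: -confidences[i])
    dcg = sum(mix_coeffs[i] / math.log2(rank + 2)
              for rank, i in enumerate(order))
    ideal = sorted(mix_coeffs, reverse=True)
    idcg = sum(rel / math.log2(rank + 2)
               for rank, rel in enumerate(ideal))
    return dcg / idcg
```

For example, a raw sample with confidence 0.9 and an augmented one with confidence 0.7 incurs zero ranking loss, while the reversed ordering is penalized; likewise, confidences sorted consistently with the mixing coefficients yield an alignment score of 1.0.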