Abstract: Miscalibration of a model is defined as the mismatch between its predicted probability estimates and the true correctness likelihood. In this work we aim to replicate the original authors' analysis of the effect of Mixup on a network's calibration. Mixup is a simple yet effective data augmentation technique that forms a convex combination of a pair of training images and of their corresponding labels, using the result as the input and target for training a network. We replicate the authors' reported results on CIFAR-100, Fashion-MNIST, STL-10, out-of-distribution data, and random noise.
NeurIPS Paper Id: https://openreview.net/forum?id=rJgxnSHg8r
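As a minimal sketch of the Mixup operation described above (assuming one-hot labels and a mixing coefficient drawn from a Beta(alpha, alpha) distribution, as in the standard formulation; function and parameter names here are illustrative, not from the replicated code):

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=None):
    """Return a convex combination of two examples and their one-hot labels."""
    rng = np.random.default_rng() if rng is None else rng
    # Mixing coefficient lam in [0, 1], sampled from Beta(alpha, alpha).
    lam = rng.beta(alpha, alpha)
    # Same lam mixes both the inputs and the targets.
    x = lam * x1 + (1.0 - lam) * x2
    y = lam * y1 + (1.0 - lam) * y2
    return x, y
```

During training, pairs are typically drawn by shuffling a batch against itself, so each example is mixed with a random partner at negligible extra cost.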