Impact of Noise on Calibration and Generalisation of Neural Networks

ICML 2023 Workshop SCIS, Submission 8

Published: 20 Jun 2023, Last Modified: 28 Jul 2023. SCIS 2023 Poster.
Keywords: Neural networks, noise perturbation, calibration, uncertainty quantification
TL;DR: Investigation of how noise perturbations impact neural network calibration and generalisation, identifying which perturbations are helpful and when.
Abstract: Noise injection and data augmentation strategies have been effective for enhancing the generalisation and robustness of neural networks (NNs). Certain types of noise, such as label smoothing and MixUp, have also been shown to improve calibration. Since noise can be added at various stages of a NN's training, this motivates the question of when and where noise is most effective. We study a variety of noise types to determine how much they improve calibration and generalisation, and under what conditions. More specifically, we evaluate various noise-injection strategies in both in-distribution (ID) and out-of-distribution (OOD) scenarios. The findings highlight that activation noise was the most transferable and effective in improving generalisation, while input augmentation noise was prominent in improving calibration on OOD but not necessarily ID data.
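One of the noise-injection strategies the abstract refers to is perturbing hidden activations during training. A minimal NumPy sketch of that general idea follows; the function name, default noise level, and API are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def apply_activation_noise(activations, noise_std=0.1, training=True, rng=None):
    """Add zero-mean Gaussian noise to hidden activations at train time only.

    Illustrative sketch of activation-noise injection; the interface here
    is hypothetical, not the paper's implementation.
    """
    if not training or noise_std == 0.0:
        # Noise is disabled at evaluation time, as is standard practice.
        return activations
    rng = rng if rng is not None else np.random.default_rng()
    return activations + rng.normal(0.0, noise_std, size=activations.shape)

# Example: perturb a hidden layer's activations during training,
# and leave them untouched at evaluation time.
h = np.ones((4, 8))
h_train = apply_activation_noise(h, noise_std=0.1, rng=np.random.default_rng(0))
h_eval = apply_activation_noise(h, training=False)
```

In a full training loop this perturbation would be applied after each hidden layer's nonlinearity on every forward pass, so the network sees a slightly different activation pattern per step.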
Submission Number: 8