Improving Robustness with Adaptive Weight Decay

Published: 21 Sept 2023, Last Modified: 02 Nov 2023. NeurIPS 2023 poster.
Keywords: Adaptive weight decay, adversarial robustness, weight decay, robust overfitting, overfitting, adversarial attacks, noisy label
TL;DR: We present Adaptive Weight Decay, a novel method for tuning the weight decay hyper-parameter on the fly during training. Settings prone to overfitting, such as adversarial training and learning with noisy labels, can benefit from AWD.
Abstract: We propose adaptive weight decay, which automatically tunes the hyper-parameter for weight decay during each training iteration. For classification problems, we propose changing the value of the weight decay hyper-parameter on the fly based on the strength of updates from the classification loss (i.e., the gradient of cross-entropy) and the regularization loss (i.e., the $\ell_2$-norm of the weights). We show that this simple modification can yield large improvements in adversarial robustness, an area that suffers from robust overfitting, without requiring extra data, across various datasets and architecture choices. For example, our reformulation results in a 20\% relative robustness improvement on CIFAR-100 and a 10\% relative robustness improvement on CIFAR-10 compared to the best-tuned hyper-parameters of traditional weight decay, yielding models with performance comparable to SOTA robustness methods. In addition, this method has other desirable properties, such as lower sensitivity to the learning rate and smaller weight norms; the latter contributes to robustness against overfitting to label noise and to pruning.
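The core idea described above, re-deriving the weight decay coefficient each iteration from the ratio of the cross-entropy gradient norm to the weight norm, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name, the `awd_coeff` constant, and the exact form of the ratio are assumptions made here for illustration.

```python
import numpy as np

def adaptive_weight_decay_step(grad_ce, weights, awd_coeff=0.01, eps=1e-8):
    """Hypothetical sketch of an adaptive weight decay update.

    The weight decay coefficient lam is set each step in proportion to
    the ratio of the classification-loss gradient norm to the weight
    norm, so the regularization strength tracks the strength of the
    cross-entropy updates (the paper's stated principle; the constant
    and exact ratio here are illustrative assumptions).
    """
    grad_norm = np.linalg.norm(grad_ce)
    weight_norm = np.linalg.norm(weights)
    lam = awd_coeff * grad_norm / (weight_norm + eps)
    # Combined update direction: CE gradient plus the adaptive L2 term.
    update = grad_ce + lam * weights
    return update, lam
```

In this sketch, when the cross-entropy gradients shrink late in training (a regime where robust overfitting tends to appear), `lam` shrinks with them, while large weight norms also pull `lam` down, which is consistent with the smaller-weight-norm property the abstract mentions.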
Supplementary Material: pdf
Submission Number: 1656