ShakeDrop regularization

12 Feb 2018 (modified: 14 Oct 2024) · ICLR 2018 Workshop Submission
Abstract: This paper proposes a powerful regularization method named ShakeDrop regularization. ShakeDrop is inspired by Shake-Shake regularization, which decreases error rates by disturbing learning. While Shake-Shake can be applied only to ResNeXt, which has multiple branches, ShakeDrop can be applied not only to ResNeXt but also to ResNet and PyramidNet in a memory-efficient way. An important and interesting feature of ShakeDrop is that it strongly disturbs learning by multiplying the output of a convolutional layer by a factor that can even be negative in the forward training pass. ShakeDrop outperformed state-of-the-art methods on CIFAR-10/100. The full version of the paper, including other experiments, is available at https://arxiv.org/abs/1802.02375.
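The sketch below illustrates the mechanism described in the abstract: during training, the output of a residual branch is multiplied by a random coefficient that can be negative. The specific details here (a per-branch Bernoulli gate with probability `p_drop`, a forward coefficient drawn from `[-1, 1]`, a separate backward coefficient drawn from `[0, 1]`, and scaling by the expected coefficient at test time) are paraphrased from the full paper linked above and should be treated as assumptions of this sketch, not a verbatim reference implementation.

```python
import torch
import torch.nn as nn


class ShakeDropFunction(torch.autograd.Function):
    """Randomly scales a residual-branch output during training.

    Forward: the branch output is kept as-is with probability p_drop,
    otherwise it is multiplied by a coefficient drawn from alpha_range,
    which includes negative values.
    Backward: a separate coefficient in [0, 1] is used (detail assumed
    from the full paper).
    """

    @staticmethod
    def forward(ctx, x, training, p_drop, alpha_range):
        if not training:
            # At test time, scale by the expected training coefficient.
            expected = p_drop + (1.0 - p_drop) * 0.5 * sum(alpha_range)
            return x * expected
        # Per-sample Bernoulli gate: 1 keeps the branch unchanged,
        # 0 applies the (possibly negative) random coefficient alpha.
        gate = torch.bernoulli(
            torch.full((x.size(0), 1, 1, 1), p_drop, device=x.device))
        alpha = torch.empty_like(gate).uniform_(*alpha_range)
        coeff = gate + (1.0 - gate) * alpha
        ctx.save_for_backward(gate)
        return x * coeff

    @staticmethod
    def backward(ctx, grad_output):
        (gate,) = ctx.saved_tensors
        # Independent coefficient beta in [0, 1] for the backward pass.
        beta = torch.empty_like(gate).uniform_(0.0, 1.0)
        coeff = gate + (1.0 - gate) * beta
        return grad_output * coeff, None, None, None


class ShakeDrop(nn.Module):
    """Module wrapper; apply it to the output of a residual branch."""

    def __init__(self, p_drop=0.5, alpha_range=(-1.0, 1.0)):
        super().__init__()
        self.p_drop = p_drop
        self.alpha_range = alpha_range

    def forward(self, x):
        return ShakeDropFunction.apply(
            x, self.training, self.p_drop, self.alpha_range)
```

In a residual block this would wrap the convolutional branch, e.g. `out = x + ShakeDrop(p_drop=0.5)(branch(x))`; the per-layer drop probability schedule used in the paper is not reproduced here.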
Data: [CIFAR-10](https://paperswithcode.com/dataset/cifar-10), [CIFAR-100](https://paperswithcode.com/dataset/cifar-100)
Community Implementations: [6 code implementations](https://www.catalyzex.com/paper/shakedrop-regularization/code)