A Robust Initialization of Residual Blocks for Effective ResNet Training without Batch Normalization

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission · Readers: Everyone
Keywords: Normalization-Free ResNets, Weights Initialization, Exploding Gradient, Residual Blocks
Abstract: Batch Normalization is an essential component of all state-of-the-art neural network architectures. However, it introduces a number of practical issues, and in recent years much research has been devoted to designing normalization-free architectures. In this paper, we show that weight initialization is key to training ResNet-like normalization-free networks. In particular, we propose a slight modification of the summation operation between a block's output and its skip-connection branch so that the whole network is correctly initialized. We show that this modified architecture achieves competitive results on CIFAR-10 without any additional regularization or algorithmic modifications.
One-sentence Summary: We propose a suitable initialization strategy to train Normalization-Free Residual Networks without additional tricks.
Supplementary Material: zip
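
The abstract describes modifying the summation between a block's output and its skip-connection branch in a normalization-free residual block, but does not spell out the exact operation. The sketch below is only an illustration of one common form such a modification can take (down-weighting the residual branch at initialization); the class name `NormFreeResidualBlock` and the scalar `alpha` are assumptions for this example, not the authors' definitions.

```python
# Minimal sketch of a normalization-free residual block, assuming the
# modification amounts to rescaling the residual branch before the
# skip-connection sum. Not the paper's exact method.
import torch
import torch.nn as nn


class NormFreeResidualBlock(nn.Module):
    def __init__(self, channels: int, alpha: float = 0.1):
        super().__init__()
        # Plain convolutions, with no BatchNorm anywhere in the block.
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU(inplace=True)
        # Scalar applied to the block output before adding the skip branch,
        # keeping the block close to the identity at initialization.
        self.alpha = alpha

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.conv1(x))
        out = self.conv2(out)
        # Modified summation: down-weight the residual branch, then add
        # the skip-connection branch.
        return self.relu(x + self.alpha * out)
```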
