Skip-connection and batch-normalization improve data separation ability

20 Mar 2019 (modified: 05 May 2023) · Submitted to LLD 2019
Keywords: Deep learning, ResNet, Skip-connection, Batch-normalization
TL;DR: The skip-connection in ResNet and batch-normalization improve the data separation ability and help train deep neural networks.
Abstract: ResNet and batch-normalization (BN) achieve high performance even when only a few labeled data are available. However, the reasons for this high performance are unclear. To clarify them, we analyze the effect of the skip-connection in ResNet and of BN on the data separation ability, an ability that is essential for classification. Our results show that, in a multilayer perceptron with randomly initialized weights, the angle between two input vectors converges to zero exponentially in the depth; that the skip-connection turns this exponential decrease into a sub-exponential one; and that BN further relaxes it to a reciprocal (1/depth) decrease. Moreover, our analysis shows that preserving the angle at initialization encourages trained neural networks to separate points from different classes. These results imply that the skip-connection and BN improve the data separation ability and thereby achieve high performance even when only a few labeled data are available.
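
The claimed decay rates can be checked numerically with a small experiment. The following is a minimal sketch, not code from the paper: it propagates a batch of random inputs through a randomly initialized ReLU MLP and records the angle between two fixed inputs at each layer, toggling skip-connections and BN. The width, depth, batch size, He-style initialization, and the placement of BN before the ReLU and inside the residual branch are all assumptions made for illustration.

```python
# Minimal sketch (assumptions, not the paper's code): measure how the angle
# between two inputs evolves with depth in a randomly initialized ReLU MLP,
# with and without skip-connections and batch-normalization.
import numpy as np

rng = np.random.default_rng(0)
width, depth, batch = 512, 50, 64   # assumed sizes for illustration

def angle(u, v):
    cos = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def forward(x, skip=False, bn=False):
    # x: (batch, width); we track the angle between rows 0 and 1.
    h = x.copy()
    angles = [angle(h[0], h[1])]
    for _ in range(depth):
        W = rng.normal(0.0, np.sqrt(2.0 / width), (width, width))  # He init
        z = h @ W.T
        if bn:
            # Batch-normalize each unit over the batch (initialization-time BN,
            # no learned scale/shift) -- placement is an assumption.
            z = (z - z.mean(axis=0)) / (z.std(axis=0) + 1e-8)
        out = np.maximum(z, 0.0)        # ReLU
        h = h + out if skip else out    # optional skip-connection
        angles.append(angle(h[0], h[1]))
    return angles

x = rng.normal(size=(batch, width))
for skip, bn in [(False, False), (True, False), (True, True)]:
    a = forward(x, skip=skip, bn=bn)
    print(f"skip={skip!s:5} bn={bn!s:5} "
          f"angle at layer 0 = {a[0]:.3f}, at layer {depth} = {a[-1]:.4f}")
```

Under the abstract's analysis one would expect the plain MLP's angle to collapse toward zero fastest, the skip-connected network to collapse more slowly, and the skip+BN network to retain the largest angle at the final layer.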