Random Bias Initialization Improving Binary Neural Network Training

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Blind Submission
TL;DR: Random bias initialization improves training of networks with saturating activations (sigmoid, tanh, hard tanh, etc.) and of Binarized Neural Networks.
Abstract: Edge intelligence, especially the binary neural network (BNN), has recently attracted considerable attention from the artificial intelligence community. BNNs significantly reduce computational cost, model size, and memory footprint. However, there is still a performance gap between successful full-precision neural networks with ReLU activation and BNNs. We argue that the accuracy drop of BNNs is due to their geometry. We analyze the behaviour of full-precision neural networks with ReLU activation and compare it with that of their binarized counterparts. This comparison suggests random bias initialization as a remedy to activation saturation in full-precision networks and leads us towards improved BNN training. Our numerical experiments confirm our geometric intuition.
Keywords: Binarized Neural Network, Activation function, Initialization, Neural Network Acceleration
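To make the idea concrete, below is a minimal PyTorch sketch of random (non-zero) bias initialization for a layer with a saturating hard tanh activation. The uniform distribution, the `bias_scale` hyperparameter, and the `HtanhLinear` module name are illustrative assumptions, not the paper's exact scheme.

```python
import torch.nn as nn


class HtanhLinear(nn.Module):
    """Linear layer followed by hard tanh, with randomly initialized bias.

    Hypothetical sketch: the bias is drawn uniformly from
    [-bias_scale, bias_scale] so that pre-activations (and the position of
    the hard tanh's nonlinear region relative to the data) vary across
    units at initialization. The distribution and scale are assumptions.
    """

    def __init__(self, in_features: int, out_features: int, bias_scale: float = 0.5):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        # Random bias initialization: sample uniformly in [-bias_scale, bias_scale].
        nn.init.uniform_(self.fc.bias, -bias_scale, bias_scale)
        self.act = nn.Hardtanh()

    def forward(self, x):
        return self.act(self.fc(x))
```

In a BNN training setup one would typically pair such a layer with weight binarization and a straight-through estimator; the sketch only isolates the bias-initialization step.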