Function changes in the backpropagation equation are equivalent to an implicit learning rate

17 May 2019 (modified: 05 May 2023) · Submitted to ICML Deep Phenomena 2019
Keywords: Learning Rate, Backpropagation, Credit Assignment, Feedback Alignment
TL;DR: We demonstrate that function changes in backpropagation are equivalent to an implicit learning rate
Abstract: The backpropagation algorithm is the de facto standard for credit assignment in artificial neural networks due to its strong empirical results. Since its conception, variants of the backpropagation algorithm have emerged; in particular, variants that change functions in the backpropagation equations to satisfy specific requirements. Feedback Alignment is one such example: it replaces the weight transpose matrix in the backpropagation equations with a random matrix in search of a more biologically plausible credit assignment algorithm. In this work, we show that function changes in the backpropagation procedure are equivalent to adding an implicit learning rate to an artificial neural network. Furthermore, we learn activation function derivatives in the backpropagation equations to demonstrate early convergence in these artificial neural networks. Our work reports competitive performance with early convergence on MNIST and CIFAR-10 for sufficiently large deep neural network architectures.
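
For readers unfamiliar with the kind of "function change" the abstract describes, the NumPy sketch below contrasts standard backpropagation with Feedback Alignment on a toy two-layer network. This is an illustrative sketch, not the paper's implementation: the network sizes, toy data, learning rate, and tanh activation are assumptions chosen for brevity. The single toggled line (W2 versus the fixed random matrix B in the hidden-layer error signal) is the substitution the abstract refers to, and the elementwise rescaling it induces on the error signal is one way to read the claimed equivalence to an implicit learning rate.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: 100 samples, 8 input features, 2 regression targets.
X = rng.standard_normal((100, 8))
Y = rng.standard_normal((100, 2))

# Two-layer network: input -> hidden (tanh) -> linear output.
W1 = 0.1 * rng.standard_normal((8, 16))
W2 = 0.1 * rng.standard_normal((16, 2))

# Fixed random feedback matrix used by Feedback Alignment in place of
# W2's transpose when propagating the error back to the hidden layer.
B = 0.1 * rng.standard_normal((16, 2))

lr = 0.05
use_feedback_alignment = True  # False -> exact backpropagation

for step in range(500):
    # Forward pass.
    h = np.tanh(X @ W1)
    y_hat = h @ W2

    # Output-layer error signal (gradient of mean squared error).
    delta_out = (y_hat - Y) / len(X)

    # Hidden-layer error signal. Backpropagation multiplies by W2.T;
    # Feedback Alignment swaps in the fixed random B.T. This is the
    # "function change" in the backpropagation equations.
    feedback = B if use_feedback_alignment else W2
    delta_hidden = (delta_out @ feedback.T) * (1.0 - h**2)  # tanh' = 1 - tanh^2

    # Gradient-descent updates.
    W2 -= lr * (h.T @ delta_out)
    W1 -= lr * (X.T @ delta_hidden)

print("final MSE:", float(np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2)))

Flipping use_feedback_alignment to False recovers exact backpropagation, so the two error signals can be compared directly; their elementwise ratio acts like a per-unit rescaling of the learning rate, which is the intuition behind the abstract's equivalence claim.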