Local Linear Approximation Algorithm for Neural Network

Anonymous

19 Oct 2020 (modified: 05 May 2023) · Submitted to NeurIPSW 2020: DL-IG
Keywords: Local linear approximation, layerwise optimized adaptive neural network, stochastic gradient descent algorithm
TL;DR: This work tackles the estimation of weights and biases in neural networks from a different point of view and develops a reliable training algorithm for deep neural networks.
Abstract: This paper is concerned with the estimation of weights and biases in a feedforward neural network (FNN). We propose using a local linear approximation (LLA) of the activation function, and develop an LLA algorithm that estimates the weights and biases of a one-hidden-layer FNN via iterative linear regression. We further propose the layerwise optimized adaptive neural network (LOAN), in which the LLA is used to estimate the weights and biases of the LOAN layer by layer, adaptively. We compare the performance of the LOAN with commonly used deep-learning procedures via analyses of four benchmark data sets. The numerical comparison indicates that the proposed LOAN may outperform the existing procedures.