Efficient Implementation of Stochastic Proximal Point Algorithm for Matrix and Tensor Completion

EUSIPCO 2021 (modified: 24 Apr 2023)
Abstract: We propose an efficient implementation of the stochastic proximal point algorithm (SPPA) for large-scale nonlinear least squares problems. SPPA has been shown to converge faster and more stably than the celebrated stochastic gradient descent (SGD) algorithm and its many variants. However, each SPPA update is itself defined as an optimization problem and has long been considered expensive. In this paper, we show that for nonlinear least squares problems, each iteration of SPPA can be carried out efficiently. Using a Gauss-Newton approximation together with the kernel trick, we obtain an implementation of the SPPA updates with the same per-iteration complexity as SGD. Encouragingly, the resulting algorithm admits more flexible choices of step sizes under assumptions similar to those for SGD. The proposed algorithm is elaborated for the problems of matrix and tensor completion. Experiments on real data showcase its effectiveness in terms of convergence compared to SGD and its variants.
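To illustrate the idea behind the abstract, the following is a minimal sketch (not the authors' code) of one SPPA iteration for a single observed entry of a rank-one matrix-completion model. It assumes a Gauss-Newton linearization of the scalar residual r(x) around the current iterate, under which the proximal subproblem min_d 0.5*(r + g·d)^2 + ||d||^2/(2η) has the closed-form solution d = -η r g / (1 + η ||g||^2); this rank-one inversion is the "kernel trick" that keeps the cost at one inner product per step, the same order as SGD. All names here are illustrative.

```python
import numpy as np

def sppa_gn_step(u, v, m, eta):
    """One SPPA iteration for one observed entry m ~= u.v of a rank-1 model.

    The residual r(u, v) = u.v - m is linearized (Gauss-Newton), and the
    proximal subproblem is solved in closed form via a rank-one inversion:
        d = -eta * r * g / (1 + eta * ||g||^2),
    so the per-iteration cost is the same order as an SGD step.
    """
    r = u @ v - m                      # scalar residual at the current point
    g = np.concatenate([v, u])         # gradient of the residual wrt [u; v]
    scale = -eta * r / (1.0 + eta * (g @ g))
    d = scale * g                      # closed-form proximal step
    return u + d[: u.size], v + d[u.size:]

# Toy example: fit one observed entry of a rank-1 model.
rng = np.random.default_rng(0)
u_true, v_true = rng.standard_normal(5), rng.standard_normal(5)
m = u_true @ v_true                    # the observed entry
u, v = rng.standard_normal(5), rng.standard_normal(5)
for _ in range(200):
    u, v = sppa_gn_step(u, v, m, eta=0.5)
print(abs(u @ v - m))                  # residual shrinks toward 0
```

Note that a fixed step size such as eta=0.5 is used here only for illustration; one claimed benefit of SPPA over SGD is precisely this robustness to the step-size choice.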