Nonconvex Sparse Logistic Regression via Proximal Gradient Descent

ICASSP 2018
Abstract: In this work we propose to fit a sparse logistic regression model by solving a weakly convex regularized nonconvex optimization problem. The idea is based on the finding that a weakly convex function, as an approximation of the ℓ₀ pseudo-norm, induces sparsity better than the commonly used ℓ₁ norm. For a class of weakly convex sparsity-inducing functions, despite the nonconvexity, the proposed algorithm is based on proximal gradient descent, which allows the use of convergence acceleration techniques and stochastic gradients. The general framework is then applied to a specific weakly convex function, and the solution method is instantiated as an iterative firm-shrinkage algorithm, whose effectiveness is demonstrated in numerical experiments.
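
The listing does not include code, so the following is a minimal NumPy sketch of the kind of method the abstract describes: proximal gradient descent on the logistic loss, where the proximal step is the firm-shrinkage (firm-thresholding) operator associated with a weakly convex, MCP-type penalty. The function names (firm_shrinkage, prox_grad_logistic), the step-size rule, and the parameter defaults are illustrative assumptions, not the paper's published implementation.

```python
import numpy as np

def firm_shrinkage(x, lam, mu):
    """Firm-thresholding operator: the proximal map of an MCP-type weakly
    convex penalty. Zeroes entries with |x| <= lam, shrinks entries in
    (lam, mu], and leaves entries with |x| > mu unchanged (unbiased for
    large coefficients, unlike soft thresholding)."""
    assert mu > lam, "firm shrinkage requires mu > threshold"
    out = np.zeros_like(x)
    mid = (np.abs(x) > lam) & (np.abs(x) <= mu)
    big = np.abs(x) > mu
    out[mid] = np.sign(x[mid]) * mu * (np.abs(x[mid]) - lam) / (mu - lam)
    out[big] = x[big]
    return out

def prox_grad_logistic(X, y, lam=0.1, mu=1.0, step=None, n_iter=500):
    """Proximal gradient descent for sparse logistic regression with a
    weakly convex penalty; labels y are assumed to be in {-1, +1}.
    This is a plain (unaccelerated, full-gradient) sketch."""
    n, d = X.shape
    if step is None:
        # 1/L, with L = ||X||_2^2 / (4n) the Lipschitz constant
        # of the logistic-loss gradient
        step = 4.0 * n / np.linalg.norm(X, 2) ** 2
    w = np.zeros(d)
    for _ in range(n_iter):
        margins = y * (X @ w)
        # gradient of (1/n) * sum_i log(1 + exp(-y_i x_i^T w))
        grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / n
        # gradient step followed by the firm-shrinkage proximal step
        w = firm_shrinkage(w - step * grad, step * lam, mu)
    return w
```

Because the firm-shrinkage step is the exact proximal map of the weakly convex penalty, the same loop can, as the abstract notes, be combined with acceleration or stochastic gradients by swapping the gradient computation, leaving the proximal step unchanged.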