Abstract: This paper develops a joint weighted $\ell_1$- and $\ell_0$-norm (WL1L0) regularization method, leveraging proximal operators and translation mapping techniques to mitigate the bias introduced by the $\ell_1$-norm in applications to high-dimensional data. A weighting parameter $\alpha$ controls the relative influence of the two regularizers. Our broadly applicable model is nonconvex and nonsmooth, but we establish convergence for the alternating direction method of multipliers (ADMM) and the strictly contractive Peaceman–Rachford splitting method (SCPRSM). Moreover, we evaluate the effectiveness of our model on both simulated and real high-dimensional genomic datasets, comparing against adaptive versions of the least absolute shrinkage and selection operator (LASSO), the elastic net (EN), the smoothly clipped absolute deviation (SCAD) penalty and the minimax concave penalty (MCP). The results show that WL1L0 outperforms LASSO, EN, SCAD and MCP, consistently achieving the lowest mean squared error (MSE) across all datasets and indicating its superior ability to handle large high-dimensional data. Furthermore, WL1L0-SCPRSM also achieves the sparsest solution.
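The exact composite proximal operator for the weighted penalty is derived in the paper; as a rough illustration of the two building blocks the abstract refers to, the following sketch shows the standard closed-form proximal operators of the $\ell_1$-norm (soft-thresholding) and the $\ell_0$-norm (hard-thresholding). Function names and the elementwise formulation are our own assumptions, not the paper's implementation.

```python
import numpy as np

def prox_l1(x, lam):
    # Proximal operator of lam * ||.||_1: elementwise soft-thresholding.
    # Shrinks each coordinate toward zero by lam, zeroing small entries.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def prox_l0(x, lam):
    # Proximal operator of lam * ||.||_0: elementwise hard-thresholding.
    # Keeps a coordinate unchanged iff keeping it beats zeroing it,
    # i.e. x_i^2 / 2 > lam; otherwise sets it to zero (no shrinkage bias).
    return np.where(x**2 > 2.0 * lam, x, 0.0)

x = np.array([3.0, -0.5, 1.2])
print(prox_l1(x, 1.0))  # soft: [ 2.  -0.   0.2]
print(prox_l0(x, 1.0))  # hard: [ 3.   0.   0. ]
```

Note the contrast the abstract exploits: soft-thresholding biases every surviving coefficient toward zero by $\lambda$, while hard-thresholding leaves survivors untouched, which is the motivation for blending the two penalties via $\alpha$.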
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: See responses to reviewers.
Assigned Action Editor: ~Yingbin_Liang1
Submission Number: 3029