Dynamic Sasvi: Strong Safe Screening for Norm-Regularized Least Squares

NeurIPS 2021 Poster
Keywords: convex optimization, Lasso, safe screening
Abstract: "Safe screening" is a recently introduced technique for sparse optimization problems that identifies irrelevant variables in the early stages of optimization. In this paper, we first propose a flexible framework for safe screening based on Fenchel–Rockafellar duality and then use it to derive a strong safe screening rule for norm-regularized least squares. We call the proposed rule "dynamic Sasvi" because it can be interpreted as a generalization of Sasvi. Unlike the original Sasvi, it does not require the exact solution of a more strongly regularized problem, so it remains safe in practice. We show that our screening rule always eliminates more features than the existing state-of-the-art methods.
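
As a concrete illustration of what a safe screening rule does, the sketch below implements the classical Gap Safe sphere test for the Lasso, min_w 0.5*||y - Xw||^2 + lam*||w||_1, rather than the paper's dynamic Sasvi rule: given any primal iterate, it certifies coefficients that must be zero at the optimum so the corresponding columns can be dropped. The function name gap_safe_screen and the toy data are illustrative assumptions, not part of the paper.

```python
import numpy as np

def gap_safe_screen(X, y, w, lam):
    """Gap Safe sphere test for the Lasso (illustration only, not dynamic Sasvi).

    Returns a boolean mask: True means the feature's optimal coefficient
    is provably zero, so its column can be safely removed.
    """
    residual = y - X @ w
    # Rescale the residual so that theta is dual feasible: ||X^T theta||_inf <= 1.
    theta = residual / max(lam, np.abs(X.T @ residual).max())
    # Duality gap between the primal iterate and the feasible dual point.
    primal = 0.5 * residual @ residual + lam * np.abs(w).sum()
    dual = 0.5 * y @ y - 0.5 * lam**2 * np.sum((theta - y / lam) ** 2)
    gap = max(primal - dual, 0.0)
    # The dual optimum lies within this radius of theta (strong concavity of the dual).
    radius = np.sqrt(2.0 * gap) / lam
    scores = np.abs(X.T @ theta) + radius * np.linalg.norm(X, axis=0)
    return scores < 1.0

# Toy usage with synthetic data.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 200))
y = rng.standard_normal(50)
lam = 0.5 * np.abs(X.T @ y).max()
mask = gap_safe_screen(X, y, np.zeros(200), lam)
print(f"{mask.sum()} of {mask.size} features safely eliminated")
```

Stronger rules such as dynamic Sasvi tighten the region guaranteed to contain the dual optimum, which can only enlarge the set of features eliminated by a test of this form.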
Supplementary Material: pdf
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.