Keywords: penalized regression, screening rules, Karush–Kuhn–Tucker, lasso, high-dimensional, sparse-group, feature reduction
TL;DR: A feature reduction approach for the sparse-group lasso and adaptive sparse-group lasso that uses strong screening rules for both variables and groups. The two layers of screening greatly reduce model fitting time.
Abstract: The sparse-group lasso (SGL) performs both variable and group selection. Its sparse-group penalty exploits grouping information while also shrinking inactive variables within active groups, and it has consequently found widespread use across many fields. However, this added penalty complexity makes SGL computationally expensive to fit. This paper introduces Dual Feature Reduction (DFR), a feature reduction approach for SGL and the adaptive SGL that applies strong screening rules to shrink the input space before optimization. DFR is derived from dual norms and applies two layers of screening, one for groups and one for variables. Synthetic and real numerical studies show that DFR drastically reduces computational cost across many different scenarios, outperforming existing screening methods and establishing it as the state-of-the-art screening rule for SGL.
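For reference, a minimal LaTeX sketch of the SGL objective in its usual formulation (the notation here is an assumption, not taken from this submission); its two penalty terms correspond to the two layers of screening that DFR exploits:

% Sketch of the standard SGL objective (assumed formulation, not from this paper):
% the \ell_1 term drives variable-level screening, the group \ell_2 term group-level screening.
\[
  \hat{\beta} \in \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p}
    \tfrac{1}{2n}\,\lVert y - X\beta \rVert_2^2
    + \lambda\alpha\,\lVert \beta \rVert_1
    + \lambda(1-\alpha)\sum_{g=1}^{G}\sqrt{p_g}\,\lVert \beta^{(g)} \rVert_2,
\]
where $\alpha \in [0,1]$ balances variable- and group-level shrinkage, $p_g$ is the size of group $g$, and $\beta^{(g)}$ is its coefficient sub-vector.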
Submission Number: 19