A General Framework for Sparsity Regularized Feature Selection via Iteratively Reweighted Least Square Minimization
Abstract: A variety of feature selection methods based on sparsity regularization have been developed with different loss functions and sparse regularization functions. Capitalizing on the existing sparsity regularized feature selection methods, we propose a general sparsity regularized feature selection (GSR-FS) algorithm that optimizes an ℓ2,r-norm (0 < r ≤ 2) based loss function with an ℓ2,p-norm (0 < p ≤ 2) sparse regularization. The ℓ2,r-norm (0 < r ≤ 2) based loss function brings the flexibility to balance data fitting and robustness to outliers by tuning its parameter, and the ℓ2,p-norm based regularization with 0 < p ≤ 1 is able to boost the sparsity for feature selection. To solve the resulting optimization problem, which involves multiple non-smooth and non-convex terms, we develop an efficient solver under the general umbrella of Iteratively Reweighted Least Square (IRLS) algorithms. Our algorithm is proved to converge with a theoretical convergence order of at least min(2 − r, 2 − p). The experimental results demonstrate that our method achieves competitive feature selection performance on publicly available datasets compared with state-of-the-art feature selection methods, at reduced computational cost.
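The IRLS scheme described above can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it assumes the common formulation min_W ||XW − Y||_{2,r}^r + λ||W||_{2,p}^p, where each IRLS step solves a weighted least-squares problem with row weights derived from the current residual norms (for the loss) and the current row norms of W (for the regularizer). The function name `gsr_fs_irls`, the smoothing constant `eps`, and all parameter defaults are illustrative assumptions.

```python
import numpy as np

def gsr_fs_irls(X, Y, lam=1.0, r=1.0, p=1.0, n_iter=50, eps=1e-8):
    """Illustrative IRLS sketch for min ||XW - Y||_{2,r}^r + lam * ||W||_{2,p}^p.

    X: (n, d) data matrix, Y: (n, k) targets (assumed setup, not the paper's code).
    Returns the weight matrix W and per-feature scores (row norms of W).
    """
    # Warm start from the ordinary least-squares solution.
    W = np.linalg.lstsq(X, Y, rcond=None)[0]
    for _ in range(n_iter):
        R = X @ W - Y
        # Loss reweighting: (r/2) * ||i-th residual row||^(r-2),
        # smoothed by eps to avoid division by zero for small residuals.
        dr = (r / 2.0) * (np.linalg.norm(R, axis=1) + eps) ** (r - 2)
        # Regularizer reweighting: (p/2) * ||j-th row of W||^(p-2);
        # rows with small norm get large weights and are driven toward zero.
        dp = (p / 2.0) * (np.linalg.norm(W, axis=1) + eps) ** (p - 2)
        # Each iteration reduces to a weighted ridge-type linear system.
        A = X.T @ (dr[:, None] * X) + lam * np.diag(dp)
        W = np.linalg.solve(A, X.T @ (dr[:, None] * Y))
    return W, np.linalg.norm(W, axis=1)
```

Features are then ranked by the row norms of W; with p ≤ 1 the reweighting drives the rows of uninformative features toward zero, which is the sparsity effect the abstract refers to.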