Ensemble Feature Selection With Block-Regularized m × 2 Cross-Validation

Published: 01 Jan 2023, Last Modified: 06 Oct 2023 · IEEE Trans. Neural Networks Learn. Syst. 2023
Abstract: Ensemble feature selection (EFS) has attracted significant interest in the literature due to its great potential for reducing the discovery rate of noise features and stabilizing feature selection results. In view of the superior performance of block-regularized $m \times 2$ cross-validation in generalization estimation and algorithm comparison, a novel EFS technique based on block-regularized $m \times 2$ cross-validation is proposed in this study. In contrast to traditional ensemble learning, where the feature selection frequency follows a binomial distribution, the selection frequency in the proposed technique is more accurately approximated by a beta distribution. Furthermore, theoretical analysis of the proposed technique shows that it yields a higher selection probability for important features, a lower selection risk for noise features, more true positives, and fewer false positives. Finally, the above conclusions are verified by experiments on simulated and real data.
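The general EFS workflow the abstract builds on can be sketched as follows. This is an illustrative frequency-counting sketch under simple assumptions (a correlation-ranking base selector, plain random 2-fold splits, and a hypothetical `efs_m_times_2` helper), not the paper's block-regularized algorithm or its beta-distribution analysis:

```python
# Sketch of ensemble feature selection via m repetitions of 2-fold splitting:
# a base selector is run on each half of every partition, each feature's
# selection frequency over the 2m runs is recorded, and features whose
# frequency exceeds a threshold are kept. The correlation-ranking selector
# and all parameter names here are illustrative assumptions.
import numpy as np

def efs_m_times_2(X, y, m=7, k=3, threshold=0.5, seed=0):
    """Keep features selected in more than `threshold` of the 2m runs.

    `k` is the number of features the base selector picks per run.
    Returns (selected feature indices, per-feature selection frequency).
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    counts = np.zeros(p)
    for _ in range(m):
        perm = rng.permutation(n)
        for idx in (perm[: n // 2], perm[n // 2:]):
            # base selector: rank features by absolute correlation with y
            scores = np.abs(
                [np.corrcoef(X[idx, j], y[idx])[0, 1] for j in range(p)]
            )
            counts[np.argsort(scores)[-k:]] += 1
    freq = counts / (2 * m)
    return np.flatnonzero(freq > threshold), freq

# Toy data: the first 3 of 10 features are informative, the rest are noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
y = X[:, 0] + X[:, 1] - X[:, 2] + 0.1 * rng.normal(size=200)
selected, freq = efs_m_times_2(X, y)
```

Aggregating over repeated splits is what stabilizes the selection: a noise feature must win the per-run ranking in a majority of the 2m runs to be kept, which becomes increasingly unlikely as m grows.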