Abstract: Feature selection is a core problem in machine learning and plays an important role in making machine-driven decisions efficient and explainable. Embedded feature selection methods, such as decision trees and LASSO, are tied to a specific learner and cannot be readily applied to many popular learners. Wrapper methods, which can fit arbitrary learning models, are receiving growing interest in many scientific fields. To search for relevant features effectively within a wrapper, many randomized schemes have been proposed. In this paper, we present efficient randomized feature selection algorithms empowered by automatic breadth-searching and attention-searching adjustments. Our schemes are generic and highly parallelizable in nature, and can easily be applied to many related algorithms. Theoretical analysis proves the efficiency of our algorithms, and extensive experiments on synthetic and real datasets show that our techniques achieve significant improvements in both the quality of the selected features and selection time.
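To make the wrapper setting concrete, the sketch below shows a generic randomized wrapper loop: feature subsets of varying breadth are sampled at random and scored by a learner-agnostic objective, keeping the best subset found. This is an illustrative stand-in, not the paper's algorithm; the subset-size penalty, the correlation-based score `corr_score`, and all parameter values are assumptions chosen for the example.

```python
import random

def random_wrapper_select(X, y, score, n_iters=200, seed=0):
    """Illustrative randomized wrapper feature selection (not the
    paper's method): sample random feature subsets, evaluate each
    with a black-box score, and return the best one found."""
    rng = random.Random(seed)
    d = len(X[0])
    best_subset, best_score = None, float("-inf")
    for _ in range(n_iters):
        k = rng.randint(1, d)                     # subset breadth varies per draw
        subset = sorted(rng.sample(range(d), k))  # random feature subset
        s = score(X, y, subset)
        if s > best_score:
            best_subset, best_score = subset, s
    return best_subset, best_score

def corr_score(X, y, subset):
    """Toy stand-in for a wrapped learner's validation accuracy:
    sum of absolute feature-target correlations, minus a small
    penalty per selected feature to discourage large subsets."""
    n = len(X)
    my = sum(y) / n
    total = 0.0
    for j in subset:
        col = [row[j] for row in X]
        mx = sum(col) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(col, y))
        vx = sum((a - mx) ** 2 for a in col) or 1e-12
        vy = sum((b - my) ** 2 for b in y) or 1e-12
        total += abs(cov) / (vx * vy) ** 0.5
    return total - 0.1 * len(subset)
```

Because each iteration only needs the score of one independent subset, the loop parallelizes trivially across workers, which is the property the abstract highlights for wrapper schemes.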