Abstract: Feature selection is often necessary when implementing classifiers in practice. Most approaches to feature selection are motivated by the curse of dimensionality, but few seek to mitigate the overall computational cost of feature extraction. In this work, we propose a model-based approach that addresses both objectives. The model is built around a sparse kernel machine whose feature scaling parameters are governed by a beta-Bernoulli prior, with hyperparameters set according to each feature's computational cost. Experiments were carried out on publicly available data sets, and the proposed Cost-Constrained Feature Optimization (CCFO) was compared to related methods in terms of classification accuracy and reduction in feature-extraction cost.