Fast and Adaptive Multi-Objective Feature Selection for Classification

ICLR 2026 Conference Submission15962 Authors

19 Sept 2025 (modified: 08 Oct 2025) · CC BY 4.0
Keywords: Multi-objective feature selection, classification, evolutionary algorithm, adaptive KNN, initialization, mutual information
TL;DR: FIAK: fast initialization and adaptive KNN
Abstract: Identifying high-quality feature subsets for decision-makers via wrapper-based multi-objective feature selection (MOFS) has attracted increasing attention. As a mainstream approach, evolutionary methods offer distinct advantages but struggle with increasingly complex application scenarios and high-dimensional data, mainly in terms of time efficiency, an exponentially expanding search space, and the adaptive determination of a suitable classifier. To overcome these challenges, this paper proposes two simple yet effective methods: Fast Initialization (FI) and one-generation Adaptive K-Nearest Neighbor (AK). FI leverages mutual information and tournament selection to locate high-quality initial feature subsets at low computational cost. AK, backed by a theoretical proof, shows that a single generation suffices to determine the most suitable KNN classifier for a given dataset, improving feature selection performance with very little time overhead and without any data analysis or assumptions. Experiments on 20 real-world high-dimensional datasets demonstrate the superior performance of FI and AK over advanced initialization and KNN methods for MOFS. We also validate that the obtained feature subsets generalize well to an LLM for tabular data, enabling it to be applied seamlessly to high-dimensional data and achieve superior performance.
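The abstract describes FI only at a high level (mutual information plus tournament selection to seed the initial population). The following is a minimal sketch of that idea, not the authors' actual algorithm: it scores each feature by its empirical mutual information with the class label, then builds each initial subset by repeated k-way tournaments that favor high-MI features. All names (`fast_init`, `subset_size`, the tournament size `k`) and the discrete-feature MI estimator are illustrative assumptions.

```python
import numpy as np

def mutual_information(x, y):
    # Empirical MI (in nats) between two discrete variables,
    # estimated from the joint and marginal frequency counts.
    mi = 0.0
    n = len(x)
    for xv in np.unique(x):
        px = np.mean(x == xv)
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))
            if pxy > 0:
                py = np.mean(y == yv)
                mi += pxy * np.log(pxy / (px * py))
    return mi

def fast_init(X, y, pop_size, subset_size, k=3, rng=None):
    # Hypothetical FI-style initialization: each feature subset is
    # assembled via k-way tournaments biased toward features whose
    # MI with the label is highest.
    if rng is None:
        rng = np.random.default_rng(0)
    n_features = X.shape[1]
    mi = np.array([mutual_information(X[:, j], y) for j in range(n_features)])
    population = []
    for _ in range(pop_size):
        chosen = set()
        while len(chosen) < subset_size:
            # Sample k candidate features; the one with the highest
            # MI score wins the tournament and joins the subset.
            cands = rng.choice(n_features, size=k, replace=False)
            chosen.add(int(cands[np.argmax(mi[cands])]))
        population.append(sorted(chosen))
    return population
```

Because tournaments sample features rather than taking a greedy top-m cut, the resulting population is diverse yet biased toward informative features, which is the usual motivation for tournament selection in evolutionary initialization.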
Supplementary Material: zip
Primary Area: optimization
Submission Number: 15962