Unsupervised Feature Selection towards Pattern Discrimination Power

Published: 26 Apr 2024, Last Modified: 15 Jul 2024. UAI 2024 poster. License: CC BY 4.0
Keywords: Unsupervised Feature Selection, Information Entropy, Entropy Approximation, Pattern Discrimination Power
TL;DR: We propose an entropy-based unsupervised feature selection method that maximizes the pattern discrimination power of the selected feature subset.
Abstract: The goal of unsupervised feature selection is to identify a feature subset based on the intrinsic characteristics of a given dataset, without user-guided information such as class variables. To achieve this, score functions based on information measures can be used to identify essential features. The major research direction in conventional information-theoretic unsupervised feature selection is to minimize the entropy of the final feature subset. Although the opposite direction, i.e., maximization of the joint entropy, can also lead to novel insights, studies in this direction are rare. For example, in the field of information retrieval, selected features that maximize the joint entropy of a feature subset can be effective discriminators for reaching the target tuple in a database. Thus, in this work, we first demonstrate, on a toy dataset, how two feature subsets differ when one is obtained by minimizing and the other by maximizing the joint entropy. By comparing these two feature subsets, we show that maximization of the joint entropy enhances the pattern discrimination power of the feature subset. We then derive a score function that circumvents the intractable high-dimensional joint entropy calculation by using a low-order approximation. Experimental results on 30 public datasets indicate that the proposed method yields superior performance in terms of pattern-discrimination-power-related measures.
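The abstract's core idea, greedily selecting features so that a low-order approximation of the subset's joint entropy is maximized, can be sketched as follows. This is a minimal illustration, not the paper's exact score function: it approximates the joint entropy of a candidate subset with marginal and pairwise empirical entropy terms, and the function names (`entropy`, `greedy_max_joint_entropy`) are hypothetical.

```python
import numpy as np
from collections import Counter

def entropy(*cols):
    """Shannon entropy (bits) of the joint distribution of the given
    discrete columns, estimated from empirical counts."""
    joint = list(zip(*(np.asarray(c) for c in cols)))
    counts = np.array(list(Counter(joint).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def greedy_max_joint_entropy(X, k):
    """Greedily pick k feature indices, at each step adding the feature whose
    marginal entropy plus pairwise joint entropies with the already-selected
    features is largest (a second-order stand-in for the full joint entropy,
    which is intractable to estimate in high dimensions)."""
    selected = []
    remaining = set(range(X.shape[1]))
    while len(selected) < k and remaining:
        best, best_score = None, -np.inf
        for f in remaining:
            score = entropy(X[:, f]) + sum(
                entropy(X[:, f], X[:, s]) for s in selected
            )
            if score > best_score:
                best, best_score = f, score
        selected.append(best)
        remaining.remove(best)
    return selected
```

As in the information-retrieval example from the abstract, a constant (zero-entropy) feature contributes nothing to discriminating tuples and is never preferred over a varying one under this score.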
List Of Authors: Seo, Wangduk and Lee, Jaesung
Latex Source Code: zip
Signed License Agreement: pdf
Submission Number: 525