Unsupervised Feature Selection using a Basis of Feature Space and Self-Representation Learning

20 Sept 2023 (modified: 11 Feb 2024). Submitted to ICLR 2024.
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Unsupervised Feature Selection, Self-Representation Learning, Graph Regularization, Sparse Subspace Learning
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: A novel representation learning method, Graph Regularized Self-Representation and Sparse Subspace Learning (GRSSLFS), is proposed for unsupervised feature selection.
Abstract: In recent years, there has been extensive research into unsupervised feature selection methods based on self-representation. However, a major gap remains between the mathematical principles underlying these approaches and their capacity to represent the feature space. In this paper, a novel representation learning method, Graph Regularized Self-Representation and Sparse Subspace Learning (GRSSLFS), is proposed for unsupervised feature selection. First, GRSSLFS formulates the self-representation problem around the concept of a ``basis of the feature space,'' representing the original feature space as a low-dimensional space spanned by linearly independent features. In addition, the manifold structure of the newly constructed subspace is learned to preserve the geometric structure of the feature vectors. Second, the objective function of GRSSLFS is developed within a self-representation framework that combines subspace learning with matrix factorization of the basis matrix. Finally, the effectiveness of GRSSLFS is evaluated through experiments on widely used datasets. Results show that GRSSLFS performs favorably in comparison with several classic and state-of-the-art feature selection methods.
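To make the pipeline outlined in the abstract concrete, the following is a minimal, purely illustrative sketch of how the named ingredients (a basis of linearly independent features, a graph capturing the manifold structure of the feature vectors, and a sparse self-representation objective) could be combined. It is not the authors' implementation: the exact GRSSLFS objective and update rules are not given in the abstract, so every function name, hyperparameter, and modeling choice below (column-pivoted QR for the basis, a k-NN feature graph, an l2,1 penalty, plain gradient descent) is an assumption made for illustration only.

```python
# Purely illustrative sketch: the abstract names the ingredients of GRSSLFS but not
# its exact objective, so everything below (pivoted QR for the basis, a k-NN feature
# graph, an l2,1 penalty, gradient descent) is an assumption, not the paper's method.
import numpy as np
from scipy.linalg import qr
from sklearn.neighbors import kneighbors_graph


def grsslfs_style_selection(X, n_basis=30, n_selected=10, alpha=0.1, beta=0.1,
                            n_neighbors=5, n_iter=300, seed=0):
    """X: (n_samples, n_features). Returns indices of n_selected features."""
    X = X / (np.linalg.norm(X, axis=0, keepdims=True) + 1e-12)  # unit-norm features
    n, d = X.shape

    # 1) A "basis of the feature space": pick (approximately) linearly independent
    #    feature columns via column-pivoted QR.
    _, _, piv = qr(X, mode="economic", pivoting=True)
    basis_idx = piv[:n_basis]
    B = X[:, basis_idx]                                  # (n, n_basis)

    # 2) Manifold structure of the feature vectors: k-NN graph over the columns
    #    of X and its (unnormalized) graph Laplacian.
    A = kneighbors_graph(X.T, n_neighbors=n_neighbors, mode="connectivity")
    A = 0.5 * (A + A.T).toarray()
    L = np.diag(A.sum(axis=1)) - A                       # (d, d)

    # 3) Graph-regularized, row-sparse self-representation, roughly
    #    min_W ||X - B W||_F^2 + alpha * tr(W L W^T) + beta * ||W||_{2,1}
    #    (gradient steps below drop constant factors).
    rng = np.random.default_rng(seed)
    W = 0.01 * rng.standard_normal((n_basis, d))
    lr = 1.0 / (np.linalg.norm(B, ord=2) ** 2
                + alpha * np.linalg.norm(L, ord=2) + beta)
    for _ in range(n_iter):
        R = B @ W - X                                    # residual (n, d)
        row_norms = np.linalg.norm(W, axis=1, keepdims=True) + 1e-8
        grad = B.T @ R + alpha * (W @ L) + beta * (W / row_norms)
        W -= lr * grad

    # 4) Rank the basis features by the l2 norm of their coefficient rows and
    #    return the top-scoring ones (n_selected must not exceed n_basis).
    scores = np.linalg.norm(W, axis=1)
    return basis_idx[np.argsort(-scores)[:n_selected]]


if __name__ == "__main__":
    X = np.random.default_rng(42).standard_normal((120, 100))
    print(grsslfs_style_selection(X, n_basis=30, n_selected=10))
```

The row-sparsity penalty and the row-norm scoring in step 4 follow the common convention in self-representation-based feature selection of ranking features by the magnitude of their representation coefficients; whether GRSSLFS scores features this way is an assumption here.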
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2619